Google’s new AI-powered dermatology assist tool will identify skin conditions
This tool will cross-check against 288 predefined skin conditions and give users a few possible matches
Tech giant Google has come up with an AI model that can identify skin conditions from descriptive text and image input.
As bizarre as Google's foray into the medical sector sounds, if this tool can actually do what Google claims, it could help identify dangerous skin conditions or even early stages of skin cancer.
How Google's dermatology assist tool will work
When the pilot of this dermatology tool launches later this year, users will be able to use their phone's camera to upload images of their skin, hair or nails to the web application.
The user will then be asked to answer some common questions, such as what symptoms they have and how long they have been facing them.
After receiving that information, along with images of the condition taken from different angles, Google's AI model will analyse the input, cross-check it against 288 predefined skin conditions and give the user a few possible matches.
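For illustration only, here is a minimal sketch of what such a matching step could look like: features extracted from the photos and the questionnaire answers are combined, scored against all 288 conditions, and the top few matches are returned. Apart from the figure of 288 conditions, every name, shape and number below is an assumption; Google has not published its model.

```python
import numpy as np

NUM_CONDITIONS = 288  # size of the predefined condition list

def predict_matches(image_features: np.ndarray,
                    answer_features: np.ndarray,
                    weights: np.ndarray,
                    k: int = 3) -> list[tuple[int, float]]:
    """Score every condition and return the top-k (condition_id, probability) pairs."""
    combined = np.concatenate([image_features, answer_features])
    logits = weights @ combined                    # one logit per condition
    logits -= logits.max()                         # for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over all 288 conditions
    top = np.argsort(probs)[::-1][:k]              # indices of the k highest scores
    return [(int(i), float(probs[i])) for i in top]

# Toy usage: random vectors stand in for real image and questionnaire features.
rng = np.random.default_rng(0)
matches = predict_matches(rng.normal(size=128), rng.normal(size=16),
                          rng.normal(size=(NUM_CONDITIONS, 144)))
print(matches)  # a few (condition_id, probability) pairs, most likely first
```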
Why do we need such a tool?
According to Google, 10 billion Google searches relating to skin, hair and nail issues are made every year.
That, and the fact that there is a genuine global shortage of dermatologists, compelled Google to work on this tool.
Besides, there is a limit to how well someone can describe a skin condition in a quick Google search in the hope of being nudged in the right direction.
Limiting factors that could hinder its real-world application
Even though few AI and machine-learning programmes can rival Google's, its model is not fail-proof.
Even if we assume that Google's model can identify skin conditions correctly 100% of the time, there is no guarantee that users will get the right matches every time, because the crucial task of capturing the images is still in the novice hands of regular smartphone users.
The lighting of a photo alone can throw the model off the trail, leading to a false match.
So a big part of the process, providing accurate data to Google, is not fail-proof, which means the pipeline as a whole is unreliable even if we consider Google's AI model to be perfect.
There is also the fact that many skin conditions require in-person evaluation to be diagnosed.
Google simply cannot identify those, since they need clinical evaluation, hence the risk of false negatives.
What this tool means for the future of medical diagnosis
An app to diagnose or identify medical conditions is not a new concept.
Over the last few years, there have been more than a dozen attempts to build a platform where people can simply describe their symptoms and learn about the condition they may have.
But they all rely on a Q&A model or on users describing their symptoms in text.
Of all such rudimentary apps, Ada, a health app, is perhaps the most successful. But unlike Ada, Google's new AI tool will not work on descriptions alone.
The dermatology assist tool will feed photos into the analysis model as well.
That being said, this tool merely matches your symptoms against its database and checks whether your skin condition is visually similar to the textbook reference photos Google holds.
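To make that concrete, here is a hypothetical illustration of visual-similarity matching: the user's photo is reduced to an embedding vector and compared against embeddings of reference photos, one per condition. The embedding size, condition names and cosine-similarity measure are illustrative assumptions, not details Google has disclosed.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar_condition(photo_embedding: np.ndarray,
                           reference_embeddings: dict[str, np.ndarray]) -> str:
    """Pick the condition whose reference embedding is closest to the user's photo."""
    return max(reference_embeddings,
               key=lambda name: cosine_similarity(photo_embedding,
                                                  reference_embeddings[name]))

# Toy usage: random vectors stand in for embeddings from a real image model.
rng = np.random.default_rng(1)
refs = {name: rng.normal(size=64) for name in ("eczema", "psoriasis", "acne")}
print(most_similar_condition(rng.normal(size=64), refs))
```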
This should not be considered a substitute for medical diagnosis, a point Google has been very clear about since the announcement.
Medical diagnosis is much more complicated and, in most cases, requires sample collection, which an app cannot do.
Drawing from that, it is easy to see more tools like this being developed as Google's takes off.
But these will only ever be, as Google puts it, assist tools.