UX challenge: Teaching a person who is blind to design

It’s been a year since I published my very first article, which was about solving an accessibility problem that has bothered me for a good half of my life. I finally think I’ve opened a door to its solution.

The Problem

a) Explain what color is to a person who was born blind
b) Help them further by teaching them how to use that knowledge to design

Years ago I challenged myself to design a system for the visually impaired to recognize color, interact with geometric shapes, and use that same technology to perform design work. This is a brief and updated version of the original article, a link to which I’ve posted at the end.


Over 80% of all the information we receive is visual (source).

Admittedly, there are visually impaired artists who create amazing works of art, but nearly all of them were able to see at some point in their lives (for instance see John Bramblitt).

Thus, it is still very hard to convey the idea of color to the majority of people with congenital blindness (blind from birth, like Tommy Edison in the example below).

There are already various applications that will read out loud and describe an image using AI or image-recognition software, but they do not allow for designing your own work (e.g. Microsoft’s Seeing AI).

Similarly, there are braille and full tactile displays, which raise shapes from a flat surface to trace and mimic imagery; there are also flat haptic displays, which simulate texture and bumps to the touch via vibrations on their surface. Some even allow users to “draw” shapes on them, but none of that technology allows for color recognition and complete control over design tools.

Inspiration (Tommy Edison)


First, we will attempt to describe color to people who have never seen it; then we need to create the hardware and software that let them use their other senses to know which colors are displayed and where. If they can’t see a picture, they must feel it!

Simply naming the colors and shapes to a user through audible feedback is not enough, since reading out the visual information takes a lot of time instead of providing an instant signal to the brain (this is what current screen readers do).

Hence, we need a way for blind users to instantly feel color. This could be done through a combination of touch and sound.

Touch: Heat-Mapping of an Image

Sighted people describe colors in temperature terms (warm or cool). If users are provided with some sort of heat map of the imagery, which they can feel by touch, they can instantly get a color value for a particular spot within that image. They can then map out the rest of it by touching all or most of the points of the heat map. This is better described in the video I show at the beginning of the article.
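The warm/cool idea can be sketched as a simple scoring function. The specific mapping below is my own assumption, not part of the original proposal: warmth is derived from a pixel’s hue angle, with orange treated as the warmest hue and blue as the coolest, scaled by saturation so that greys stay neutral.

```python
import colorsys

def warmth(r, g, b):
    """Warmth score in [-1, +1] for an RGB color (0-255 channels).

    Hypothetical mapping: +1 near orange (the "warmest" hue),
    -1 near blue (the "coolest"), 0 for desaturated greys.
    """
    h, s, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    # Angular distance from orange (30 degrees), wrapped to 0..180.
    d = abs((hue_deg - 30 + 180) % 360 - 180)
    # Linear scale: 0 degrees away -> +1, 180 degrees away -> -1;
    # multiplied by saturation so greys (s == 0) score 0.
    return (1 - d / 90) * s
```

A heat map of an image is then just this score evaluated per pixel.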

A few rules and principles about color should also be explained. Users need to be taught which emotions and ideas colors evoke in society, marketing, and business. They also need to learn basic principles of color pairings, swatches, and theory. After that, they can be trained to design basic layouts for posters, covers, and other minimalist yet beautiful works, and progress further into designing more sophisticated products.

Once they have a basic understanding of color usage, they can start feeling it with the new technology. This can be achieved if the hardware has a surface divided into individual thermal points (e.g. 5 mm x 5 mm) that can independently change their temperature from 40 °F to 138 °F. Any image can then be represented by cool and warm spots on this surface, and by simply running their hands across it, users should get an initial idea of the “mood” or theme of that image.
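As a sketch of how such a pad could be driven, per-pixel warmth scores could be averaged into one value per thermal point and mapped onto the pad’s temperature range. The 40-138 °F range comes from the text; the block size and the averaging scheme are my own illustrative choices.

```python
def thermal_grid(scores, cell=2, t_min=40.0, t_max=138.0):
    """Downsample a 2D grid of warmth scores in [-1, +1] into
    cell x cell blocks (one block per thermal point) and map each
    block's mean score onto the pad's temperature range in Fahrenheit.
    """
    rows, cols = len(scores), len(scores[0])
    out = []
    for r in range(0, rows, cell):
        row = []
        for c in range(0, cols, cell):
            block = [scores[i][j]
                     for i in range(r, min(r + cell, rows))
                     for j in range(c, min(c + cell, cols))]
            mean = sum(block) / len(block)
            # Map mean in [-1, +1] linearly onto [t_min, t_max].
            row.append(t_min + (mean + 1) / 2 * (t_max - t_min))
        out.append(row)
    return out
```

For example, a uniformly warm 2x2 patch maps to a single point at the pad’s maximum temperature.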

Sound: Harmonizing & Associating with Colors

In addition to the heat mapping, each color will also have a tune. As users run their fingers across the thermal map, they will get a general idea of the image by heat first. They will not hear a tune just yet, so as not to be overwhelmed. Sound can only be heard at one point at a time, meaning that they will need to lift all but one finger to get the information for that point. This means that the heat pad will also need multi-touch sensitivity (like a touchscreen) in order to tell the computer where the user’s fingers land. If done rapidly, like typing, users should be able to quickly touch all 10 fingers at various points and get the color values at those points, building a color map in their mind.

Short audible feedback can voice the exact color value in RGB for confirmation. Moreover, the tunes will be assigned to colors based on harmonic principles: just as certain colors clash with one another, so do certain notes, tunes, or sounds when played together or in sequence.
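One possible color-to-tune assignment (entirely my assumption; the article does not specify one) divides the hue wheel into 12 equal slices and maps each slice to a semitone of one equal-tempered octave starting at A4:

```python
import colorsys

A4 = 440.0  # reference pitch in Hz

def color_to_pitch(r, g, b):
    """Map an RGB color's hue to a frequency in Hz.

    Hypothetical scheme: 12 equal hue slices -> 12 equal-tempered
    semitones above A4, so red lands on A itself.
    """
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    semitone = round(h * 12) % 12
    return A4 * 2 ** (semitone / 12)
```

Under this scheme red sounds as 440 Hz, and its complement, cyan, lands a tritone above.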

Thus, we can teach users to recognize good and bad pairings of colors by mapping specific sounds or tunes to colors in such a way that when users pair colors and check the result with sound, the overall tune corresponding to that pairing will be harmonious if the color selection is aesthetically pleasing. If not, users will know that it “didn’t sound good,” meaning the color combination is not aesthetically pleasing.
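The harmony check described above could then be as simple as testing whether two colors’ assigned notes form a consonant interval. The hue-to-semitone assignment and the set of consonant intervals below are my own illustrative choices, not the article’s:

```python
import colorsys

# Intervals (in semitones) conventionally treated as consonant:
# unison, minor/major thirds, perfect fourth, perfect fifth,
# minor/major sixths.
CONSONANT = {0, 3, 4, 5, 7, 8, 9}

def semitone(r, g, b):
    """Assign one of 12 semitones to a color by its hue slice."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return round(h * 12) % 12

def sounds_good(color_a, color_b):
    """True if the two colors' notes form a consonant interval,
    used here as a stand-in for an aesthetically pleasing pairing.
    """
    interval = abs(semitone(*color_a) - semitone(*color_b)) % 12
    return interval in CONSONANT
```

With this mapping, red against green forms a major third (consonant), while red against cyan forms a tritone (dissonant).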
