Data in support of the article "Can robots express facial emotions dominantly enough for use in dementia care?", authored by Evgenios Vlachos and Zheng-Hua Tan and published in 2020 in the journal International Psychogeriatrics, Cambridge University Press.

Our objective is to evaluate the recognition and denomination of the six basic emotional facial expressions as displayed by a social robot to persons with dementia, and to compare the results with those from the evaluation of static photographs of humans from the Paul Ekman database, in order to investigate the differences in recognition rates between the two stimuli. It is a cross-sectional study following a between-subjects approach.

Files:
- Figure. The humanoid robot iSocioBot portraying the six basic facial expressions: Happiness/Anger/Fear (top row, left to right), Sadness/Disgust/Surprise (bottom row, left to right).
- Table. Confusion matrix of the recognition rates, in percentages, showing how the signaled facial expressions are confused with the recognized ones for both the robot and human stimuli (the highest values are typeset in boldface). "Signaled" emotions are those the researcher attempts to convey to the user via a medium.
- Method. Robotic apparatus: iSocioBot, http://socialrobot.dk/

Work supported by Grant DFF – 1335-00162.
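As an illustration of how such a confusion matrix is read, the sketch below uses made-up placeholder percentages (not the published data). Rows correspond to the signaled emotion, columns to the recognized emotion, and the per-emotion recognition rate is the diagonal entry of each row.

```python
# Minimal sketch of reading a confusion matrix of recognition rates.
# The percentages below are HYPOTHETICAL placeholders, not the published data.
emotions = ["Happiness", "Anger", "Fear", "Sadness", "Disgust", "Surprise"]

# rows = signaled emotion, columns = recognized emotion (percentages per row)
confusion = [
    [90,  2,  1,  3,  2,  2],   # signaled Happiness
    [ 5, 70,  5, 10,  8,  2],   # signaled Anger
    [ 4,  6, 60, 10,  5, 15],   # signaled Fear
    [ 3,  7,  5, 75,  8,  2],   # signaled Sadness
    [ 2, 12,  6,  8, 65,  7],   # signaled Disgust
    [ 5,  3, 10,  2,  0, 80],   # signaled Surprise
]

# The recognition rate for each emotion is the diagonal entry:
# the share of trials in which the signaled emotion was recognized as itself.
rates = {emotions[i]: confusion[i][i] for i in range(len(emotions))}

for emotion, rate in rates.items():
    print(f"{emotion}: {rate}%")
```

Off-diagonal entries in a row show which other emotions the signaled expression was mistaken for, which is the comparison of interest between the robot and human stimuli.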