Can AI Language Models Improve Human Sciences Research? A Phenomenological Analysis and Future Directions


  • Marika D'Oria, Scientific Directorate, Fondazione Policlinico Universitario A. Gemelli IRCCS



Human-Computer Interaction, Natural Language Processing, Artificial Intelligence, Human Sciences research, Methodology


The article explores the use of the “ChatGPT” artificial intelligence language model in the Human Sciences field. ChatGPT uses natural language processing techniques to imitate human language and engage in artificial conversations. While the platform has gained attention from the scientific community, opinions on its usage are divided. The article presents some conversations with ChatGPT to examine ethical, relational, and linguistic issues related to human-computer interaction (HCI) and to assess the platform’s potential for Human Sciences research. Interacting with the platform recalls the “uncanny valley” phenomenon well known in Social Robotics. While ChatGPT can be beneficial, its results require proper supervision and verification. Furthermore, new research methods must be developed for qualitative, quantitative, and mixed-methods research.


Alavi, H. S., & Lalanne, D. (2020). Interfacing AI with Social Sciences: The Call for a New Research Focus in HCI. In F. Loizides, M. Winckler, U. Chatterjee, J. Abdelnour-Nocera, & A. Parmaxi (Eds.), Human Computer Interaction and Emerging Technologies: Adjunct Proceedings from the INTERACT 2019 Workshops (pp. 197–202). Cardiff: Cardiff University Press.

Bateson, G. (1972). Steps to an Ecology of Mind. Chicago: University of Chicago Press.

Bateson, G., Jackson, D., Haley, J., & Weakland, J. (1956). Toward a Theory of Schizophrenia. Behavioral Science, 1, 251–264.

Boella, L. (2017). Grammatica del sentire. Compassione, simpatia, empatia. Milano: CUEM.

Bradford, D. K., Ireland, D., McDonald, J., Tan, T., Hatfield-White, E., Regan, T., et al. (2020). ‘Hear’ to Help Chatbot. Co-development of a Chatbot to Facilitate Participation in Tertiary Education for Students on the Autism Spectrum and those with Related Conditions. Final Report. Brisbane: Cooperative Research Centre for Living with Autism. Retrieved June 18, 2023 from

Cave, S., & Dihal, K. (2019). Hopes and Fears for Intelligent Machines in Fiction and Reality. Nature Machine Intelligence, 1, 74–78.

ChatGPT Generative Pre-trained Transformer, & Zhavoronkov, A. (2022). Rapamycin in the Context of Pascal’s Wager: Generative Pre-trained Transformer Perspective. Oncoscience, 9, 82–84.

Cheetham, M., Suter, P., & Jäncke, L. (2014). Perceptual Discrimination Difficulty and Familiarity in the Uncanny Valley: More Like a “Happy Valley”. Frontiers in Psychology, 5.

Crutzen, R., Peters, G., Portugal, S., Fisser, E., & Grolleman, J. (2011). An Artificially Intelligent Chat Agent That Answers Adolescents’ Questions Related to Sex, Drugs, and Alcohol: An Exploratory Study. Journal of Adolescent Health, 48(5), 514–519.

de Arriba-Pérez, F., García-Méndez, S., González-Castaño, F. J., & Costa-Montenegro, E. (2022). Automatic Detection of Cognitive Impairment in Elderly People Using an Entertainment Chatbot with Natural Language Processing Capabilities. Journal of Ambient Intelligence and Humanized Computing.

Fiske, A., Henningsen, P., & Buyx, A. (2019). Your Robot Therapist Will See You Now: Ethical Implications of Embodied Artificial Intelligence in Psychiatry, Psychology and Psychotherapy. Journal of Medical Internet Research, 21(5).

Giorgi, A. (1970). Psychology as a Human Science. New York: Harper & Row.

Gurcan, F., Cagiltay, N., & Cagiltay, K. (2021). Mapping Human-Computer Interaction Research Themes and Trends from Its Existence to Today: A Topic Modeling-Based Review of past 60 Years. International Journal of Human-Computer Interaction, 37(3), 267–280.

Hertz, N., & Wiese, E. (2019). Good advice is beyond all price, but what if it comes from a machine?. Journal of Experimental Psychology: Applied, 25(3), 386–395.

Infarinato, F., Jansen-Kosterink, S., Romano, P., van Velsen, L., Op den Akker, H., Rizza, F., et al. (2020). Acceptance and Potential Impact of the eWALL Platform for Health Monitoring and Promotion in Persons with a Chronic Disease or Age-Related Impairment. International Journal of Environmental Research and Public Health, 17(21).

Kyriazakos, S., Schlieter, H., Gand, K., Caprino, M., Corbo, M., Tropea, P., et al. (2020). A Novel Virtual Coaching System Based on Personalized Clinical Pathways for Rehabilitation of Older Adults-Requirements and Implementation Plan of the vCare Project. Frontiers in Digital Health, 2.

Lakoff, G., & Johnson, M. (1980). Metaphors We Live By. Chicago: University of Chicago Press.

Lord, S. P., Sheng, E., Imel, Z. E., Baer, J., & Atkins, D. C. (2015). More Than Reflections: Empathy in Motivational Interviewing Includes Language Style Synchrony Between Therapist and Client. Behavior Therapy, 46(3), 296–303.

Maxmen, A. (2018). Self-driving Car Dilemmas Reveal that Moral Choices are not Universal. Nature, 562, 469–470.

Merleau-Ponty, M. (1945). Phénoménologie de la perception. Paris: Gallimard.

Ong, D. C., Zaki, J., & Goodman, N. D. (2020). Computational Models of Emotion Inference in Theory of Mind: A Review and Roadmap. Topics in Cognitive Science, 11(2), 338–357.

OpenAI (2023). Introducing ChatGPT. Retrieved March 6, 2023 from

Osborne, J. W. (2011). Some Basic Existential-Phenomenological Research Methodology for Counsellors. Canadian Journal of Counselling and Psychotherapy, 24(2), 79–91.

Qasem, F. (2023). ChatGPT in Scientific and Academic Research: Fears and Reassurances. Library Hi Tech News, 40(3), 30–32.

Reddit (2022). ChatGPT, Tell Me a Joke About Men & Women. Retrieved March 6, 2023 from

Riegelsberger, J., Sasse, M. A., & McCarthy, J. D. (2005). The Mechanics of Trust: A Framework for Research and Design. International Journal of Human-Computer Studies, 62(3), 381–422.

Rizzolatti, G., & Sinigaglia, C. (2008). Mirrors in the Brain. How Our Minds Share Actions, Emotions, and Experience. Oxford: Oxford University Press.

Sardareh, S. A., Brown, G. T. L., & Denny, P. (2021). Comparing Four Contemporary Statistical Software Tools for Introductory Data Science and Statistics in the Social Sciences. Teaching Statistics, 43(S1), S157–S172.

Singh, H., Bhangare, A., Singh, R., Zope, S., & Saindane, P. (2023). Chatbots: A Survey of the Technology. In J. Hemanth, D. Pelusi, & J. Chen (Eds.), Intelligent Cyber Physical Systems and Internet of Things (pp. 671–691). Cham: Springer.

Singh, S. (2021). Racial Biases in Healthcare: Examining the Contributions of Point of Care Tools and Unintended Practitioner Bias to Patient Treatment and Diagnosis. Health, Online First, December 7.

Spicer, J., & Sanborn, A. N. (2019). What Does the Mind Learn? A Comparison of Human and Machine Learning Representations. Current Opinion in Neurobiology, 55, 97–102.

Stein, E. (1989). On the Problem of Empathy (3rd revised ed.). Washington (DC): CS Publications.

Stokel-Walker, C. (2023). ChatGPT Listed as Author on Research Papers: Many Scientists Disapprove. Nature, 613, 620–621.

Van Manen, M. (2016). Researching Lived Experience: Human Science for an Action Sensitive Pedagogy (2nd ed.). Abingdon: Routledge.

Wankhade, M., Rao, A. C. S., & Kulkarni, C. (2022). A Survey on Sentiment Analysis Methods, Applications, and Challenges. Artificial Intelligence Review, 55, 5731–5780.

Watzlawick, P. (1983). The Situation Is Hopeless, But Not Serious: The Pursuit of Unhappiness. New York: W. W. Norton & Company.

Wiese, E., & Weis, P. (2020). It Matters to Me if You are Human - Examining Categorical Perception in Human and Nonhuman Agents. International Journal of Human-Computer Studies, 133, 1–12.

Zahour, O., Benlahmar, E. H., Eddaoui, A., Ouchra, H., & Hourrane, O. (2020). A System for Educational and Vocational Guidance in Morocco: Chatbot E-Orientation. Procedia Computer Science, 175, 554–559.

Zou, J., & Schiebinger, L. (2018). AI Can be Sexist and Racist – It’s Time to Make it Fair. Nature, 559(7714), 324–326.




How to Cite

D’Oria, M. (2023). Can AI Language Models Improve Human Sciences Research? A Phenomenological Analysis and Future Directions. Encyclopaideia, 27(66), 77–92.