Abramson, P. R., & Inglehart, R. (1995). Value change in global perspective. Ann Arbor, MI: University of Michigan Press.
Amin, K. S., Forman, H. P., & Davis, M. A. (2024). Even with ChatGPT, race matters. Clinical Imaging, 109, 110113.
Baert, S., De Pauw, A. S., & Deschacht, N. (2016). Do employer preferences contribute to sticky floors? ILR Review, 69(3), 714-736.
Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671.
Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American Economic Review, 94(4), 991-1013.
Chander, A. (2016). The racist algorithm. Michigan Law Review, 115, 1023.
Dastin, J. (2018, October 11). Insight - Amazon scraps secret AI recruiting tool that showed bias against women. Reuters.
Dator, J. A., Pratt, R., & Seo, Y. (2006). Fairness, globalization, and public institutions: East Asia and beyond. Honolulu, HI: University of Hawaiʻi Press.
De-Arteaga, M., Romanov, A., Wallach, H., Chayes, J., Borgs, C., Chouldechova, A., … & Kalai, A. T. (2019). Bias in bios: A case study of semantic representation bias in a high-stakes setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 120-128).
Drage, E., & Mackereth, K. (2022). Does AI debias recruitment? Race, gender, and AI’s “eradication of difference”. Philosophy & Technology, 35(4), 89.
Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. (2012, January). Fairness through awareness. In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (pp. 214-226).
Fabris, A., Baranowska, N., Dennis, M. J., Graus, D., Hacker, P., Saldivar, J., … & Biega, A. J. (2025). Fairness and bias in algorithmic hiring: A multidisciplinary survey. ACM Transactions on Intelligent Systems and Technology, 16(1), 1-54.
Fernández-Reino, M., & Brindle, B. (2024). Migrants in the UK labour market: An overview. Migration Observatory briefing, COMPAS, University of Oxford.
Garg, N., Schiebinger, L., Jurafsky, D., & Zou, J. (2018). Word embeddings quantify 100 years of gender and ethnic stereotypes. Proceedings of the National Academy of Sciences, 115(16), E3635-E3644.
González, M. J., Cortina, C., & Rodríguez, J. (2019). The role of gender stereotypes in hiring: A field experiment. European Sociological Review, 35(2), 187-204.
Houser, K. A. (2019). Can AI solve the diversity problem in the tech industry: Mitigating noise and bias in employment decision-making. Stanford Technology Law Review, 22, 290.
Iso, H., Pezeshkpour, P., Bhutani, N., & Hruschka, E. (2025). Evaluating Bias in LLMs for Job-Resume Matching: Gender, Race, and Education. arXiv preprint arXiv:2503.19182.
Joseph, M., Kearns, M., Morgenstern, J. H., & Roth, A. (2016). Fairness in learning: Classic and contextual bandits. Advances in Neural Information Processing Systems, 29.
Kidder, L. H. (1986). There is no word for “fair”—Notes from Japan. In International Conference on Social Justice in Human Relations, Leiden University, Netherlands.
Kidder, L. H., & Muller, S. (1991). What is “fair” in Japan? In H. Steensma & R. Vermunt (Eds.), Social justice in human relations (Critical Issues in Social Justice). Boston, MA: Springer.
Kim, T.-Y., & Leung, K. (2007). Forming and reacting to overall fairness: A cross-cultural comparison. Organizational Behavior and Human Decision Processes, 104(1), 83-95.
Köchling, A., & Wehner, M. C. (2020). Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Business Research, 13(3), 795-848.
Kordzadeh, N., & Ghasemaghaei, M. (2021). Algorithmic bias: Review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388-409.
Kübler, D., Schmid, J., & Stüber, R. (2018). Gender discrimination in hiring across occupations: a nationally-representative vignette study. Labour Economics, 55, 215-229.
Kumar, S. H., Sahay, S., Mazumder, S., Okur, E., Manuvinakurike, R., Beckage, N., … & Nachman, L. (2024). Decoding biases: Automated methods and LLM judges for gender bias detection in language models. arXiv preprint arXiv:2408.03907.
Kuncel, N. R., Klieger, D. M., & Ones, D. S. (2014). In hiring, algorithms beat instinct. Harvard Business Review, 92(5), 32.
Liu, Y., Radanovic, G., Dimitrakakis, C., Mandal, D., & Parkes, D. C. (2017). Calibrated fairness in bandits. arXiv preprint arXiv:1707.01875.
Mahler, I., Greenberg, L., & Hayashi, H. (1981). A comparative study of rules of justice: Japanese versus American. Psychologia: An International Journal of Psychology in the Orient.
Nemanick, R. C., Jr., & Clark, E. M. (2002). The differential effects of extracurricular activities on attributions in resume evaluation. International Journal of Selection and Assessment, 10(3), 206-217.
Petit, P. (2007). The effects of age and family constraints on gender hiring discrimination: A field experiment in the French financial sector. Labour Economics, 14(3), 371-391.
Quillian, L., & Midtbøen, A. H. (2021). Comparative perspectives on racial discrimination in hiring: The rise of field experiments. Annual Review of Sociology, 47(1), 391-415.
Raghavan, M., Barocas, S., Kleinberg, J., & Levy, K. (2020, January). Mitigating bias in algorithmic hiring: Evaluating claims and practices. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 469-481).
Rivera, L. A. (2011). Ivies, extracurriculars, and exclusion: Elite employers’ use of educational credentials. Research in Social Stratification and Mobility, 29(1), 71-90.
Rooth, D. O. (2021). Correspondence testing studies. IZA World of Labor.
Saxena, N. A., Huang, K., DeFilippis, E., et al. (2020). How do fairness definitions fare? Testing public attitudes towards three algorithmic definitions of fairness in loan allocations. Artificial Intelligence, 283, 103238.
Scarborough, J. (1998). The origins of cultural differences and their impact on management. London: Quorum Books.
Thomas, C. W., Cage, R. J., & Foster, S. C. (1976). Public opinion on criminal law and legal sanction: An examination of two conceptual models. The Journal of Criminal Law and Criminology, 67(1), 110-116. doi:10.2307/1142462
Veldanda, A. K., Grob, F., Thakur, S., Pearce, H., Tan, B., Karri, R., & Garg, S. (2023). Are Emily and Greg still more employable than Lakisha and Jamal? Investigating algorithmic hiring bias in the era of ChatGPT. arXiv preprint arXiv:2310.05135.