Stereotyping based on race, origin, profession, religion, and nationality has been an unfortunate constant in human society. Although stereotyping is not a fair way to judge someone's professional ability, it is an unconscious habit that almost everyone has fallen into at one point or another. In fact, we routinely make unconscious decisions based on bias. For example, when picturing an entry-level computer programmer in Silicon Valley, many of us imagine a young white man in glasses typing at his computer rather than a black coder. In the US, this stereotype may stem from the fact that there are relatively few African American programmers, and as a result, corporate human resources departments may be less willing to trust black applicants for programming jobs. This is what Stephanie Lampkin set out to change: ensuring that skilled candidates are not held back by stereotypes and bias. She also believes the stereotype that black people cannot be programmers will fade if the number of black programmers grows.
“People are making unconscious choices all the time. In hiring, two identical resumes and the only difference is one name is Joe and Jose can make a big difference.” – Stephanie Lampkin
In a study conducted by the National Bureau of Economic Research, resumes bearing generic "white-sounding" names like Emily or Greg were more likely to receive a callback for an interview than those bearing generic "black-sounding" names like Jamal or Lakisha. Researchers sent out nearly identical resumes in response to more than 1,300 job advertisements, tailoring them to each posting's requirements, industry, and experience level. White-sounding names received one callback for every ten applications, while black-sounding names received one callback for every fifteen, despite equivalent skills and experience. This suggests that discrimination operates early in the hiring process, at the moment employers see the name on a resume. Although the study could not measure interview success rates or salary offers, since all of the resumes were fictitious, it confirmed a simple truth: an applicant who never gets an interview will never get the job.
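To put the study's two callback rates side by side, a quick back-of-the-envelope calculation shows what the gap amounts to in relative terms:

```python
# Callback rates reported in the NBER resume study described above.
white_rate = 1 / 10   # one callback per ten applications (white-sounding names)
black_rate = 1 / 15   # one callback per fifteen applications (black-sounding names)

# Relative advantage: how many callbacks a white-sounding name received
# for every callback a black-sounding name received.
ratio = white_rate / black_rate

print(f"Ratio of callback rates: {ratio:.1f}")  # 1.5, i.e. 50% more callbacks
```

In other words, a one-in-ten rate versus a one-in-fifteen rate means white-sounding names drew roughly 50 percent more callbacks for the same qualifications.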
After graduating from the MIT Sloan School of Management's MBA program in 2013, Stephanie Lampkin founded Blendoor in 2014. Lampkin believes that workplace diversity, in both gender and race, makes work more productive and efficient. She also has a strong technical background in programming. After earning her BS in Management Science and Engineering from Stanford, she interviewed with a well-known Silicon Valley firm but was rejected in her eighth interview because her "background was not technical enough." She went on to spend five years as a Technical Account Manager at Microsoft, then worked at TripAdvisor while pursuing her MBA. Despite these successes, she still wondered about the true reason for that rejection years earlier, and whether being a young black woman had played a part.
Blendoor is a mobile career-matching application. Job seekers post their resumes to the app, while companies and recruiters register with Blendoor and post open positions. Applicants can then search for and apply to the jobs they want. When an applicant applies, the algorithm hides the applicant's profile picture and name, so the recruiter sees only a nameless resume. In other words, recruiters do not know who the applicants are; they see only education, work experience, and skills, and must judge qualifications on those alone. If a recruiter likes a resume, the recruiter can start a conversation with the applicant and even arrange an interview; if not, the recruiter simply rejects the application. Hiring based on merit in this way may reduce discrimination and bias triggered by names, and potentially increase the diversity of talent. In an interview with MadameNoire, a lifestyle website featuring black women, Lampkin said she sees Blendoor as a default hiring app, since name, gender, and race should not factor into the process.
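Blendoor's actual implementation is not public, but the blind-screening idea described above, stripping identifying fields from a profile before a recruiter sees it, can be sketched in a few lines. The field names here ("name", "photo_url", and so on) are illustrative assumptions, not Blendoor's real schema:

```python
# Hypothetical sketch of blind resume screening; field names are assumptions.
IDENTIFYING_FIELDS = {"name", "photo_url", "gender", "race"}

def anonymize_profile(profile: dict) -> dict:
    """Return a copy of a candidate profile with identifying fields
    removed, leaving only merit-based information for the recruiter."""
    return {k: v for k, v in profile.items() if k not in IDENTIFYING_FIELDS}

applicant = {
    "name": "Jose Martinez",
    "photo_url": "https://example.com/jose.jpg",
    "education": "BS, Management Science and Engineering",
    "experience": ["Technical Account Manager, 5 years"],
    "skills": ["Python", "SQL"],
}

# The recruiter's view contains only education, experience, and skills.
print(anonymize_profile(applicant))
```

The design choice is simple: rather than trusting reviewers to ignore demographic signals, the system never shows them in the first place, which mirrors the orchestra analogy described below.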
Christina Scanlon, Blendoor's user interface designer, said the app is designed to be like listening to an orchestra without seeing the performers. The audience cannot see who is playing or know the demographic makeup of the orchestra; the only thing it can judge is the quality of the music.
This app could help prevent recruiters and companies from making biased decisions based on an applicant's gender, cultural background, or race. However, that demographic information inevitably becomes visible the moment the applicant walks into a job interview. In addition, most companies that adopt the app likely do not hold racist or stereotyping views in the first place. Moreover, even if the app helps an applicant land an interview, the interviewer may still carry biased opinions into it, and the rejection email can always cite "not fitting the corporate culture," "not passing an interview test," or "more qualified applicants." It could play out much as it did for Lampkin, who was rejected after her eighth interview with that prestigious tech firm, possibly because of her race and gender. Hiring managers with biases against certain races, genders, or cultural backgrounds can also still recruit through mainstream platforms such as LinkedIn, university job boards, Glassdoor, Indeed, or even Craigslist. In the worst case, the app only slightly improves the odds that minority applicants will interview with recruiters who may be biased against their background; whether they actually receive the jobs is beyond what the app can determine.