By Lilian H. Hill
Many uses of
algorithms are beneficial; however, there are dangers involved. Decisions about
admissions, scholarship awards, and hiring have been turned over to algorithms.
Flaws in programming can cost individuals the opportunity to attend their
college of choice because their entrance exams were graded on a faulty
metric. Hiring decisions can be skewed if the metrics involved are
biased against minorities. People's privacy can be violated if the algorithms
designed to share individuals' information are inaccurate.
In one notable news
story, Northeastern University installed heat sensor devices under graduate
students' desks to track usage (Ongweso, 2022). Given that the students were
enrolled in Northeastern's Cybersecurity and Privacy Institute, it should not be
surprising that they detected the devices, hacked into them, and
developed an open-source guide so that other students could hack them.
They then removed the devices and displayed them in an art exhibit spelling out
the word NO! The university had installed the devices at night without informing
the students and without Institutional Review Board (IRB) approval. The students
found that the devices were not as secure as the university claimed.
Recommendations
The Center for Democracy
and Technology recommends the following:
- Human beings need to
retain control of decision-making that involves people’s privacy, safety, and
opportunities. Context and nuance are difficult to program into
algorithms.
- Regulate data governance:
Establish policies that determine how long information should be kept and under
what conditions it should be deleted.
- Conduct regular audits to
ensure that discriminatory outcomes or other unexpected harms do not occur (a
minimal sketch of one such check appears after this list).
- Communicate regularly with
stakeholders so they can provide feedback, and address their concerns about the
systems that affect their schools.
- Use algorithms only for the
purposes for which they were designed. Adapting them to other purposes can
yield harmful results.
- Foster accountability by
developing plans and policies to identify and correct errors in the
programming. Have strategies and resources available to make amends when errors
have been harmful to people.
- Ensure legal compliance so
that the decisions made by algorithms are fair, accurate, and consistent with
legal standards for education.
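To make the audit recommendation more concrete, the sketch below shows one common screening heuristic, the "four-fifths" comparison of selection rates across demographic groups. It is a minimal illustration, not part of the Center for Democracy and Technology guidance: the data, group labels, and 0.8 threshold are assumptions for the example, and in practice an institution would use its own exported decision records and legal counsel's chosen standard.

```python
from collections import defaultdict

# Hypothetical admission decisions exported from an algorithmic system.
# Each record pairs a demographic group label with the algorithm's outcome.
# These records and group names are illustrative, not real data.
decisions = [
    ("group_a", "admit"), ("group_a", "admit"), ("group_a", "reject"),
    ("group_a", "admit"), ("group_b", "reject"), ("group_b", "admit"),
    ("group_b", "reject"), ("group_b", "reject"),
]

def selection_rates(records):
    """Return the share of favorable ('admit') outcomes for each group."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        if outcome == "admit":
            favorable[group] += 1
    return {g: favorable[g] / totals[g] for g in totals}

def four_fifths_check(rates, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    highest group's rate (the common 'four-fifths' screening heuristic)."""
    best = max(rates.values())
    return {g: r / best >= threshold for g, r in rates.items()}

rates = selection_rates(decisions)
print(rates)                     # {'group_a': 0.75, 'group_b': 0.25}
print(four_fifths_check(rates))  # group_b falls below the threshold
```

A failed check of this kind is a signal for human review, not proof of discrimination; context and nuance, as noted above, still require people in the loop.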
References
Center for
Democracy and Technology. (n.d.). Algorithmic systems in education: Incorporating
equity and fairness when using student data. https://cdt.org/insights/algorithmic-systems-in-education-incorporating-equity-and-fairness-when-using-student-data/
Ongweso, E.
(2022, December 2). 'NO': Grad students analyze, hack, and remove under-desk
surveillance devices designed to track them. Vice. https://www.vice.com/en/article/m7gwy3/no-grad-students-analyze-hack-and-remove-under-desk-surveillance-devices-designed-to-track-them