Amazon
Incidents involved as Developer and Deployer
Incident 34 · 35 Reports
Amazon Alexa Responding to Environmental Inputs
2015-12-05
There are multiple reports of Amazon Alexa products (Echo, Echo Dot) reacting to and acting upon unintended stimuli, usually television commercials or news reporters' voices.
Incident 37 · 33 Reports
Female Applicants Down-Ranked by Amazon Recruiting Tool
2016-08-10
Amazon shut down an internal AI recruiting tool that down-ranked female applicants.
Incident 15 · 24 Reports
Amazon Censors Gay Books
2008-05-23
A "cataloging error" in Amazon's book store caused books with gay and lesbian themes to lose their sales rankings, reducing their visibility on the sales platform.
Incident 2 · 17 Reports
Warehouse robot ruptures can of bear spray and injures workers
2018-12-05
Twenty-four Amazon workers in New Jersey were hospitalized after a robot punctured a can of bear repellent spray in a warehouse.
Affected by Incidents
Incident 625 · 5 Reports
Proliferation of Products on Amazon Titled with ChatGPT Error Messages
2024-01-12
Products titled with ChatGPT error messages, such as lawn chairs and religious texts, are proliferating on Amazon. These apparently AI-generated titles indicate a lack of editing and undermine the sense of authenticity and reliability of product listings.
Incident 528 · 2 Reports
Amazon Algorithmic Pricing Allegedly Hiked up Price of Reference Book to Millions
2023-04-08
Amazon's pricing algorithm was implicated when a reference book about flies was listed at an unusually high price of millions of dollars, allegedly because two competing sellers used the paid repricing service, which set each seller's price based on the other's.
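The runaway price described here is consistent with a known failure mode of mutual algorithmic repricing: when each of two sellers sets its price as a fixed multiple of the other's and the product of the two multipliers exceeds 1, the listed prices compound without bound. The snippet below is a minimal, hypothetical Python sketch of that feedback loop; the starting prices and multipliers are illustrative assumptions, not details of Amazon's pricing service or of the sellers in this incident.

```python
# Hypothetical illustration of a mutual-repricing feedback loop.
# The starting prices and multipliers are invented for this sketch; they are
# not taken from Amazon's pricing service or the sellers in this incident.

def simulate_repricing_spiral(price_a=100.0, price_b=100.0,
                              factor_a=1.27, factor_b=0.998, rounds=60):
    """Reprice two listings against each other for a number of rounds."""
    for _ in range(rounds):
        price_a = factor_a * price_b  # seller A prices above seller B
        price_b = factor_b * price_a  # seller B prices just under seller A
    return price_a, price_b

if __name__ == "__main__":
    a, b = simulate_repricing_spiral()
    # Because factor_a * factor_b > 1, both prices grow exponentially,
    # exceeding a million dollars within a few dozen repricing rounds.
    print(f"Seller A: ${a:,.2f}  Seller B: ${b:,.2f}")
```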
Incident 575 · 1 Report
Amazon Rife with Many Allegedly AI-Generated Books of Suspect Quality
2023-06-28
Amazon's Kindle Unlimited young adult romance bestseller list was flooded with allegedly AI-generated books that made little to no sense, disrupting the rankings. These books were reported to be "clearly there to click farm." Despite being removed from the bestseller list, many remained available for purchase. The incident raised concerns about the integrity of the platform and the potential financial impact on legitimate authors.
Incidents involved as Developer
Incident 111 · 5 Reports
Amazon Flex Drivers Allegedly Fired via Automated Employee Evaluations
2015-09-25
Amazon Flex's contract delivery drivers were dismissed through an automated performance evaluation with minimal human involvement, based on metrics affected by factors outside the drivers' control, and without a chance to contest or appeal the decision.
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women Bodies
2006-02-25
Automated content moderation tools for detecting sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of content that did not break platform policies.
Incidents involved as Deployer
Incident 395 · 4 Reports
Amazon Forced Deployment of AI-Powered Cameras on Delivery Drivers
2021-03-02
Amazon delivery drivers were forced to consent to algorithmic collection and processing of their location, movement, and biometric data through AI-powered cameras, or be dismissed.
Incident 116 · 2 Reports
Amazon's AI Cameras Incorrectly Penalized Delivery Drivers for Mistakes They Did Not Make
2021-09-20
Amazon's automated performance evaluation system, which relies on AI-powered cameras, incorrectly penalized delivery drivers for mistakes they did not make, hurting their chances of receiving bonuses and rewards.
Related Entities
Microsoft
Incidents involved as Developer and Deployer
- Incident 102 · 2 Reports
Personal voice assistants struggle with black voices, new study shows
- Incident 587 · 1 Report
Apparent Failure to Accurately Label Primates in Image Recognition Software Due to Alleged Fear of Racial Bias