Computing Bias
Popcorn Hack #1
Voice recognition software often struggles with non-native English speakers, putting them at a disadvantage. A key cause is inadequate diversity in the training data, which overrepresents native accents and underrepresents others.
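One way to catch this kind of imbalance before training is to audit how accents are distributed in the dataset. The sketch below is a minimal illustration, assuming a hypothetical dataset of (audio_id, accent_label) pairs; the 10% threshold is an arbitrary choice for the example.

```python
from collections import Counter

def audit_accent_balance(samples, threshold=0.10):
    """Return accent groups that make up less than `threshold` of the data.

    `samples` is a list of (audio_id, accent_label) pairs -- an assumed
    schema for illustration, not a real library's format.
    """
    counts = Counter(accent for _, accent in samples)
    total = sum(counts.values())
    return {accent: n / total for accent, n in counts.items()
            if n / total < threshold}

# Toy dataset heavily skewed toward one accent group.
data = [("a1", "US")] * 85 + [("a2", "Indian")] * 10 + [("a3", "Nigerian")] * 5
print(audit_accent_balance(data))
```

Running a check like this during data collection would flag underrepresented groups early, before the skew turns into recognition errors for real users.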
Popcorn Hack #2
I once tried to turn off my Alexa alarm, but it wouldn't recognize my commands—it kept ringing while I repeated the request multiple times. It was frustrating to feel ignored. One improvement would be to increase wake-word sensitivity and recognition accuracy across diverse accents and intonations, so the device consistently understands a range of voice inputs.
Popcorn Hack #3
Bias can creep in if the app assumes all users have the same fitness levels or goals, neglecting factors like mobility issues or age-related constraints. To avoid this, allow users to set personalized goals, offer varying difficulty levels, and include adaptive workout plans that consider different health conditions. Ensuring diverse representation in training data and user testing can also help make recommendations fair and inclusive.
Homework Hack #1
I regularly use a streaming service that often shows me certain genres repeatedly, even if I’ve shown interest in a broader variety. This bias could stem from an algorithm trained predominantly on popular movies and shows, overlooking niche interests or specific needs. A possible fix is to expand data collection to emphasize lesser-seen content and conduct targeted user testing with diverse audiences, ensuring more balanced recommendations for all.