OpenAI Broke Privacy Laws in Canada
Introduction
Canada's privacy leaders say OpenAI broke the law. The company did not protect people's data when it made ChatGPT.
Main Body
The privacy leaders looked at how OpenAI collects information. They found that OpenAI took too much personal data. This included health information and data about children. OpenAI did not tell people that it took their data from the internet. Also, people could not change or delete their own information. OpenAI also had a problem with a school shooting. The chatbot knew about the violence, but the company did not tell the police. The boss, Sam Altman, said sorry.
Conclusion
OpenAI says it will change how it uses data to follow the law.
Learning
⚡ The "Did Not" Pattern
In this story, we see a very common way to say that someone did not do something. We use did not + action word.
From the text:
- did not protect → ❌ No protection
- did not tell → ❌ No information given
How it works: When we use did not, the action word stays in its basic form. You don't need to change the word to the past.
Compare:
- Past: The boss said sorry. (Happened)
- Negative Past: The company did not tell the police. (Did not happen)
A2 Quick Guide: Person + did not + base verb
Examples: I did not eat the apple. She did not go to school.