UK Watchdog Urges Tech Industry to Embed Data Protection in AI Development, amid Rising Data Breaches


Over 3,000 cyber breaches were reported in 2023 to the ICO, which regulates the collection and use of personal data.

Tech businesses must “bake in” data protection at every stage of the development of artificial intelligence (AI) technologies in order to give the utmost protection to people’s personal information, the UK’s data watchdog chief has warned.

The watchdog has found that when AI utilises personal data, it falls under the scope of existing data protection and transparency laws. This includes the use of personal data for training, testing, or deploying AI systems.

John Edwards, the UK Information Commissioner, will warn an audience of tech leaders, as part of a speech about privacy, AI and emerging technologies: “As leaders in your field, I want to make it clear that you must be thinking about data protection at every stage of your development, and you must make sure that your developers are considering this too.”

Sachin Agrawal, Managing Director of Zoho UK, said: “As AI continues to revolutionise business operations, it is crucial that data protection is embedded by design. Companies should implement data protection at every stage of AI development, ensuring privacy is ‘baked in’ to protect both internal and customer data.

According to Zoho’s Digital Health Study, 36 per cent of UK businesses surveyed said that data privacy plays a critical role in the success of their business. However, only 42 per cent of respondents said they comply with all regulations and industry guidelines. This gap shows a need for increased education so businesses can be more responsible with customer data protection in all aspects of data handling, not just AI.

Commercial exploitation of customer data is commonplace in the industry, but we would argue it is unethical. We believe a customer owns their own data, not us, and only using it to further the products we deliver is the right thing to do. This approach ensures compliance with legislation and builds trust, which strengthens customer relationships.

The demand for ethically driven data practices is expected to grow even stronger as the use of AI accelerates. Businesses that do not centre their policies around the customer’s best interests might find customers looking elsewhere for a more responsible alternative.”
