The Times Australia

Tech companies are turning to ‘synthetic data’ to train AI models – but there’s a hidden cost

  • Written by James Jin Kang, Senior Lecturer in Computer Science, RMIT University Vietnam

Last week the billionaire and owner of X, Elon Musk, claimed[1] the pool of human-generated data that’s used to train artificial intelligence (AI) models such as ChatGPT has run out.

Musk didn’t cite evidence to support this. But other leading tech industry figures have made similar claims[2] in recent months. And earlier research[3] indicated human-generated data would run out within two to eight years.

This is largely because humans can’t create new data such as text, video and images fast enough to keep up with the speedy and enormous demands of AI models. When genuine data does run out, it will present a major problem for both developers and users of AI.

It will force tech companies to depend more heavily on data generated by AI, known as “synthetic data”. And this, in turn, could lead to the AI systems currently used by hundreds of millions[4] of people becoming less accurate and reliable – and therefore less useful.

But this isn’t an inevitable outcome. In fact, if used and managed carefully, synthetic data could improve AI models.

Phone running ChatGPT application in front of OpenAI logo.
Tech companies such as OpenAI are using more synthetic data to train AI models. T. Schneider/Shutterstock[5]

The problems with real data

Tech companies depend on data – real or synthetic – to build, train and refine generative AI models such as ChatGPT. The quality of this data[6] is crucial. Poor data leads to poor outputs, in the same way using low-quality ingredients in cooking can produce low-quality meals.

Real data[7] refers to text, video and images created by humans. Companies collect it through methods such as surveys, experiments, observations or mining of websites and social media.

Real data is generally considered valuable because it includes true events and captures a wide range of scenarios and contexts. However, it isn’t perfect.

For example, it can contain spelling errors and inconsistent or irrelevant content[8]. It can also be heavily biased[9], which can, for example, lead to generative AI models creating images[10] that show only men or white people in certain jobs.

This kind of data also requires a lot of time and effort to prepare. First, people collect datasets and label them[11] to make them meaningful for an AI model. They then review and clean the data to resolve any inconsistencies, before computers filter, organise and validate it.

This process can take up to 80% of the total time investment[12] in the development of an AI system.
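The collect–label–clean–validate workflow described above can be sketched in a few lines of code. This is an illustrative toy, not any real toolkit: the record fields, function names and rules (collapse whitespace, drop empty or duplicate texts, require a label) are all assumptions made for the example.

```python
# Toy sketch of a real-data preparation pipeline:
# collect -> label -> clean -> validate. All names are illustrative.

def clean_records(records):
    """Normalise whitespace, then drop empty and duplicate texts."""
    seen = set()
    cleaned = []
    for rec in records:
        text = " ".join(rec.get("text", "").split())  # collapse stray whitespace
        if not text or text.lower() in seen:
            continue  # skip empty or duplicate entries
        seen.add(text.lower())
        cleaned.append({**rec, "text": text})
    return cleaned

def validate_records(records, required=("text", "label")):
    """Keep only records that carry every required field."""
    return [r for r in records if all(r.get(k) for k in required)]

# "Collected" data, with the kinds of flaws the article mentions.
raw = [
    {"text": "  The cat sat  on the mat ", "label": "animal"},
    {"text": "The cat sat on the mat", "label": "animal"},  # duplicate
    {"text": "", "label": "noise"},                          # empty
    {"text": "Stocks rose sharply", "label": None},          # unlabelled
    {"text": "Stocks fell sharply", "label": "finance"},
]

prepared = validate_records(clean_records(raw))
print(len(prepared))  # only the usable records survive
```

Even in this tiny example, three of five “collected” records are discarded, which hints at why preparation dominates the time budget of real projects.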

But as stated above, real data is also in increasingly short supply[13] because humans can’t produce it quickly enough to feed burgeoning AI demand.

The rise of synthetic data

Synthetic data[14] is artificially created or generated by algorithms[15], such as text generated by ChatGPT[16] or an image generated by DALL-E[17].
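At its simplest, generating synthetic data means fitting a statistical model to real data and then sampling from that model. The sketch below is a deliberately minimal illustration (the “real” numbers are made up, and a single Gaussian is a far cruder generator than ChatGPT or DALL-E):

```python
import random
import statistics

random.seed(0)

# Made-up "real" measurements, e.g. resting heart rates from a survey.
real = [72, 75, 68, 80, 77, 71, 69, 74, 76, 73]

# Fit a trivially simple generative model: a normal distribution
# with the real data's mean and standard deviation.
mu = statistics.mean(real)
sigma = statistics.stdev(real)

# Draw as many synthetic records as we like -- supply is limited
# only by compute, not by human activity.
synthetic = [random.gauss(mu, sigma) for _ in range(1000)]

print(round(statistics.mean(synthetic), 1))  # close to the real mean of 73.5
```

The same principle, scaled up to neural networks with billions of parameters, is how modern models produce synthetic text and images.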

In theory, synthetic data offers a cost-effective and faster solution for training AI models.

It also addresses privacy concerns and ethical issues[18], particularly with sensitive personal information like health data.

Importantly, unlike real data, it isn’t in short supply. In fact, it can be produced in effectively unlimited quantities.

The challenges of synthetic data

For these reasons, tech companies are increasingly turning to synthetic data to train their AI systems. Research firm Gartner estimates[19] that by 2030, synthetic data will become the main form of data used in AI.

But although synthetic data offers promising solutions, it is not without its challenges.

A primary concern is that AI models can “collapse”[20] when they rely too much on synthetic data. This means they start generating so many “hallucinations” – responses that contain false information – and decline so much in quality and performance that they become unusable.

For example, AI models already struggle[21] with spelling some words correctly. If this mistake-riddled data is used to train other models, then they too are bound to replicate the errors.
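The collapse dynamic can be shown with a toy experiment, a deliberately simplified stand-in for retraining a language model on its own output. Each “generation” below fits a Gaussian to the previous generation’s samples and then trains (samples) only from that fit; the numbers and sample sizes are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(42)

# Generation 0: a small sample of "real" data from a population
# with mean 0 and standard deviation 1.
SAMPLE_SIZE = 10
data = [random.gauss(0, 1) for _ in range(SAMPLE_SIZE)]
initial_std = statistics.stdev(data)

# Each later generation is trained only on the previous generation's
# synthetic output: fit a Gaussian to it, then resample from the fit.
for _ in range(300):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    data = [random.gauss(mu, sigma) for _ in range(SAMPLE_SIZE)]

final_std = statistics.stdev(data)
# The spread shrinks over the generations: the chain gradually
# "forgets" the tails of the original distribution.
print(round(initial_std, 3), round(final_std, 6))
```

Because each generation only ever sees a finite sample of the last one, rare values keep dropping out and diversity ratchets downwards, which is the essence of the model-collapse result reported in the research cited above.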

Synthetic data also carries a risk of being overly simplistic[22]. It may be devoid of the nuanced details and diversity found in real datasets, which could result in the output of AI models trained on it also being overly simplistic and less useful.

Creating robust systems to keep AI accurate and trustworthy

To address these issues, it’s essential that international bodies and organisations such as the International Organisation for Standardisation[23] or the United Nations’ International Telecommunication Union[24] introduce robust systems for tracking and validating AI training data, and ensure the systems can be implemented globally.

AI systems can be equipped to track metadata, allowing users or systems to trace the origins and quality of any synthetic data they’ve been trained on. This would complement a globally standard tracking and validation system.
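One way such metadata tracking could work is sketched below. This is a hypothetical scheme, not an existing standard: each synthetic record is stamped with its generator’s name, an optional link to the data it was derived from, and a content hash that lets anyone later verify the record hasn’t been altered.

```python
import hashlib

def stamp(record_text, generator, parent_hash=None):
    """Attach illustrative provenance metadata to a piece of synthetic data."""
    digest = hashlib.sha256(record_text.encode("utf-8")).hexdigest()
    return {
        "text": record_text,
        "provenance": {
            "generator": generator,   # which model produced it
            "parent": parent_hash,    # hash of the data it was derived from
            "sha256": digest,         # fingerprint of the content itself
        },
    }

def verify(record):
    """Recompute the hash to check the record wasn't altered after stamping."""
    expected = record["provenance"]["sha256"]
    actual = hashlib.sha256(record["text"].encode("utf-8")).hexdigest()
    return expected == actual

rec = stamp("A synthetic sentence about heart rates.", generator="toy-model-v1")
print(verify(rec))   # True: untouched record passes the check
rec["text"] = "A tampered sentence."
print(verify(rec))   # False: any edit breaks the fingerprint
```

A real global standard would need far more (signatures, registries, agreed schemas), but even this minimal stamp makes a record’s origin checkable rather than taken on trust.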

Humans must also maintain oversight of synthetic data throughout the training process of an AI model to ensure it is of a high quality. This oversight should include defining objectives, validating data quality, ensuring compliance with ethical standards and monitoring AI model performance.

Somewhat ironically, AI algorithms can also play a role in auditing and verifying data, ensuring the accuracy of AI-generated outputs from other models. For example, these algorithms can compare synthetic data against real data to identify any errors or discrepancies, helping ensure the data is consistent and accurate. So in this way, synthetic data could lead to better AI models.
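A very crude version of such an audit can be sketched by comparing summary statistics of a synthetic dataset against the real data it imitates. The threshold and datasets below are invented for illustration; a real auditor would use richer statistical tests, but the principle of flagging drift is the same.

```python
import statistics

def audit(real, synthetic, tolerance=0.25):
    """Flag synthetic data whose mean or spread drifts too far from the real data.

    `tolerance` is an illustrative relative threshold, not a standard value.
    """
    issues = []
    real_mu, real_sd = statistics.mean(real), statistics.stdev(real)
    syn_mu, syn_sd = statistics.mean(synthetic), statistics.stdev(synthetic)
    if abs(syn_mu - real_mu) > tolerance * real_sd:
        issues.append("mean has drifted")
    if abs(syn_sd - real_sd) > tolerance * real_sd:
        issues.append("spread has drifted")
    return issues

real = [72, 75, 68, 80, 77, 71, 69, 74, 76, 73]
good = [72, 76, 67, 79, 78, 70, 69, 75, 77, 72]   # similar shape to real
bad  = [73, 73, 73, 74, 73, 73, 74, 73, 73, 73]   # diversity has collapsed

print(audit(real, good))  # []
print(audit(real, bad))   # ['spread has drifted']
```

Note that the collapsed dataset passes the mean check but fails the spread check: faithful synthetic data has to reproduce the diversity of the real data, not just its average.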

The future of AI depends on high-quality data[25]. Synthetic data will play an increasingly important role in overcoming data shortages.

However, its use must be carefully managed to maintain transparency, reduce errors and preserve privacy – ensuring synthetic data serves as a reliable supplement to real data, keeping AI systems accurate and trustworthy.

References

  1. ^ claimed (www.theguardian.com)
  2. ^ similar claims (www.theverge.com)
  3. ^ earlier research (arxiv.org)
  4. ^ hundreds of millions (www.demandsage.com)
  5. ^ T. Schneider/Shutterstock (www.shutterstock.com)
  6. ^ quality of this data (mindkosh.com)
  7. ^ Real data (www.questionpro.com)
  8. ^ spelling errors and inconsistent or irrelevant content (www.technologyreview.com)
  9. ^ heavily biased (guides.library.utoronto.ca)
  10. ^ creating images (theconversation.com)
  11. ^ before labelling them (theconversation.com)
  12. ^ 80% of the total time investment (www.neurond.com)
  13. ^ increasingly short supply (apnews.com)
  14. ^ Synthetic data (blogs.nvidia.com)
  15. ^ generated by algorithms (arxiv.org)
  16. ^ ChatGPT (chatgpt.com)
  17. ^ DALL-E (openai.com)
  18. ^ privacy concerns and ethical issues (www.thehastingscenter.org)
  19. ^ estimates (www.gartner.com)
  20. ^ AI models can “collapse” (www.nature.com)
  21. ^ already struggle (techcrunch.com)
  22. ^ overly simplistic (arxiv.org)
  23. ^ International Organisation for Standardisation (www.iso.org)
  24. ^ International Telecommunication Union (www.itu.int)
  25. ^ high-quality data (www.forbes.com)

Read more https://theconversation.com/tech-companies-are-turning-to-synthetic-data-to-train-ai-models-but-theres-a-hidden-cost-246248
