How Much Data Do You Need to Train an AI?

In this article, we’ll look at ways to determine how much data you need to train an AI model with satisfactory results.

Defining the right amount of data to train an Artificial Intelligence is a tricky venture.

Indeed, it’s complex to know in advance exactly how much data is needed for an AI to accomplish a task.

Nevertheless, by understanding the usefulness of data, by considering all the factors influencing it, and by applying the right strategy, it is possible to clarify the data requirements for an AI project.

Why does AI need data?

An Artificial Intelligence (AI) is an algorithm designed to perform a task. What sets it apart from traditional algorithms is that it learns to solve the task on its own.

This approach is called Machine Learning: the algorithm learns from examples rather than being explicitly programmed.

To do this, the algorithm needs to be exposed to sample solutions. The more diverse and varied the samples it sees, the better it will perform at its task.

Let’s say we want to create a facial recognition AI. In that case, we’ll need a large variety of photos of faces beforehand.

Indeed, for an AI to recognize a face, it needs to be shown photos of that face represented in a wide range of conditions.

One set of conditions for facial recognition could be:

  • illuminated face
  • face in darkness
  • overexposed face
  • underexposed face
  • face alone
  • face in a crowd
  • face among other faces

Depending on the needs and requirements of a project, the range of conditions can expand rapidly.

The variability of the face itself must also be taken into account. A face can be photographed in different ways:

  • profile
  • front view
  • high angle
  • low angle
  • etc.

All these photos will have to meet the conditions seen above for the AI to be optimal. Facial variability therefore increases the number of examples required.

In addition, if we want to recognize the faces of multiple people, we’ll have, here again, to take their variability into account and satisfy the set of conditions for all the faces to be recognized.
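To get a feel for this combinatorial growth, here is a minimal sketch that counts the photos a face dataset might need. The condition lists mirror the ones above, while the number of identities and samples per combination are purely illustrative assumptions:

```python
from itertools import product

# Capture conditions for a face dataset (illustrative values)
lighting = ["illuminated", "dark", "overexposed", "underexposed"]
context = ["alone", "in a crowd", "among other faces"]
angle = ["profile", "front view", "high angle", "low angle"]

# One combination = one set of conditions a face should be photographed in
combinations = list(product(lighting, context, angle))
print(len(combinations))  # 4 * 3 * 4 = 48 condition sets per face

# With several identities and a few samples per condition set,
# the required number of photos multiplies quickly
identities = 100
samples_per_combination = 5
print(len(combinations) * identities * samples_per_combination)  # 24000
```

Adding a single new condition axis multiplies the total again, which is exactly why the example count can explode.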

Sample images from the VGGFace2 dataset

Thus, for an AI to be able to perform facial recognition, the number of examples required to obtain satisfactory results can increase exponentially.

So, an Artificial Intelligence is an algorithm that automatically learns to solve a task from examples.

These example solutions are called “data”.

Data is often grouped together in what is known as a dataset.

It’s crucial to understand what impacts the required size of a dataset.

In addition to the conditions of a project, other factors influence the answer to the question “how much data is needed to train an AI?”

Factors Influencing the Amount of Data Required

The amount of data required to train an AI can vary greatly from one project to another.

Indeed, various factors influence this quantity, and starting an AI project without taking them into account can prove perilous.

To determine how much data is needed to train an AI, you need to consider the following factors:

1. Model Complexity

The complexity of an AI refers to its architecture, such as the number of layers in a neural network or, more generally, the number of parameters.

More complex models often require a large amount of data to effectively capture the nuances and characteristics of the task.

For example, a neural network for speech recognition will often require more data than a linear regression model for the classification of tabular data.

Thus, the more complex a model, the more data it needs to reach good performance.
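One rough heuristic sometimes cited is on the order of ten training samples per model parameter. It is not a law, just a way to see how fast complexity inflates data needs. The sketch below counts the parameters of two hypothetical fully connected architectures and applies that heuristic:

```python
def mlp_param_count(layer_sizes):
    """Number of weights + biases in a fully connected network."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

small = mlp_param_count([10, 1])              # linear model: 11 parameters
large = mlp_param_count([784, 256, 128, 10])  # small MLP: ~235k parameters

# Heuristic: ~10 samples per parameter (order of magnitude only)
print(small * 10)  # 110
print(large * 10)  # 2351460
```

Even a modest neural network dwarfs a linear model by four orders of magnitude, which is why architecture choice should come before data collection.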

2. Task Specificity

Task specificity is the nature of the problem the model is designed to solve.

Some tasks are inherently more data-intensive than others.

A simple regression, for example, may require little data, while a more complex task, such as machine translation, may require a substantial dataset, particularly to represent the subtleties of languages.

Consequently, the specificity of the task influences the volume of data required.

3. Data Quality

Data quality refers to its relevance, accuracy and representativeness.

High-quality data can reduce the need for large quantities of data.

Indeed, sharp, high-resolution images are more effective at training a model to solve a facial recognition task than blurred, low-resolution images.
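In practice, this often means filtering the dataset before training. Here is a minimal sketch of such a quality gate; the resolution thresholds and the metadata fields are hypothetical, chosen only for illustration:

```python
MIN_WIDTH, MIN_HEIGHT = 224, 224  # hypothetical quality thresholds

def is_usable(image_meta):
    """Keep only images large and sharp enough to be worth labeling."""
    return (image_meta["width"] >= MIN_WIDTH
            and image_meta["height"] >= MIN_HEIGHT
            and not image_meta.get("blurry", False))

dataset = [
    {"file": "a.jpg", "width": 640, "height": 480},
    {"file": "b.jpg", "width": 120, "height": 90},
    {"file": "c.jpg", "width": 1024, "height": 768, "blurry": True},
]
clean = [m for m in dataset if is_usable(m)]
print([m["file"] for m in clean])  # ['a.jpg']
```

A smaller, cleaner dataset like this often beats a larger noisy one.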

By the way, if your goal is to master Deep Learning, I’ve prepared the Action plan to Master Neural networks for you.

7 days of free advice from an Artificial Intelligence engineer to learn how to master neural networks from scratch:

  • Plan your training
  • Structure your projects
  • Develop your Artificial Intelligence algorithms

I have based this program on scientific facts, on approaches proven by researchers, but also on my own techniques, which I have devised as I have gained experience in the field of Deep Learning.

To access it, click here:



Now we can get back to what I was talking about earlier.

Hence, data quality is crucial and can compensate for quantity for some AI models.

4. Data Variability

Variability corresponds to the diversity and range of available data.

The more varied the data, the more effectively the model can learn and generalize.

For example, a speech recognition model needs to be trained with voices of different accents, ages and tones.

For this reason, a high degree of variability in the data is essential for robust model generalization.

5. Desired Performance

Desired performance relates to the expected level of accuracy or efficiency of the model.

Higher performance often requires more data.

A model designed to accurately detect tumors in medical images will require more data than a model designed to simply apply a filter to a selfie.

Performance objectives therefore also define the threshold for the quantity and quality of data required.
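You can even estimate that threshold from small pilot runs. Empirically, error often falls roughly as a power law of dataset size; assuming that pattern (it is a common observation, not a guarantee), two pilot measurements let you extrapolate the size needed for a target error. The pilot numbers below are invented for illustration:

```python
import math

# Two hypothetical pilot runs: (dataset size, error rate)
n1, e1 = 1_000, 0.20
n2, e2 = 4_000, 0.12

# Assume a power law: error = a * n ** (-b)
# Solve for the exponent b and the constant a from the two points.
b = math.log(e1 / e2) / math.log(n2 / n1)
a = e1 * n1 ** b

# Dataset size needed to reach a stricter target error
target = 0.05
n_needed = (a / target) ** (1 / b)
print(round(b, 3), int(n_needed))  # roughly 43,000 samples
```

Halving the error again can require an order of magnitude more data, which is exactly why ambitious performance targets are expensive.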

6. Computing Capacity

Computing capacity corresponds to the hardware resources available for training models.

Hardware constraints can limit the use of data.

For example, ChatGPT was trained on a vast portion of the text available on the Internet. Training on such a titanic amount of data would not have been possible without OpenAI’s reported ~25,000 GPUs.

Thus, computing capacity is a limiting factor in the amount of data that can be used.

7. Using Advanced Techniques

Advanced techniques refer to expert methods such as Transfer Learning, Data Augmentation, Semi-Supervised Learning and so on.

These techniques can reduce the need for data.

Transfer Learning, for example, enables a model already trained by an expert on large amounts of data to be adapted to a specific task using less data.

Note: I mentioned Transfer Learning in my Hugging Face article. To consult it, click here.

The use of advanced techniques can thus optimize the need for data and reduce the quantity required.

Each factor presented here plays an important role in the amount of data needed to train an AI.

It’s crucial to take them all into account to strategically determine the right amount of data for your specific case. But there’s one more thing that could be of great use to you…

My Advice for Determining How Much Data You Need to Train an AI

Before starting a business project, it’s a good practice to do a POC (Proof Of Concept).

In addition to assessing the feasibility of a project, a POC also helps determine the resources needed to carry it out.

So, thanks to the POC, a project manager can understand and decide whether a project is feasible and, if so, the strategy and allocation of resources needed to implement it.

Artificial Intelligence projects are no exception to this rule.

I therefore recommend conducting a first experiment by training an AI with a limited amount of data.

Then, depending on the results, and taking into account the factors mentioned above, we’ll be able to decide how much data is initially needed to produce a first version of the product (an MVP).
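That experiment can be run as a simple loop: train on growing fractions of the data and stop once the accuracy gain becomes marginal. In the sketch below, `train_and_evaluate` is a hypothetical placeholder, simulated here with a diminishing-returns curve so the loop is runnable; in a real POC it would train and score your actual model.

```python
# POC sketch: grow the training set until the marginal gain is small.

def train_and_evaluate(fraction):
    # Placeholder: simulated accuracy with diminishing returns.
    # Replace with real training + evaluation on `fraction` of the data.
    return 0.95 - 0.30 / (1 + 20 * fraction)

MIN_GAIN = 0.03  # stop when an increase buys less than this (assumption)
fractions = [0.10, 0.25, 0.50, 1.00]

prev_acc = None
for frac in fractions:
    acc = train_and_evaluate(frac)
    print(f"{frac:.0%} of data -> accuracy {acc:.3f}")
    if prev_acc is not None and acc - prev_acc < MIN_GAIN:
        print(f"Marginal gain below {MIN_GAIN}: enough data at {frac:.0%}")
        break
    prev_acc = acc
```

The stopping point of this loop is a data-driven answer to “how much data do I need for the MVP?”.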

Once the MVP has been produced and the customer is satisfied with this first version, it is possible to improve the AI by adjusting the amount of data allocated to its training.

Going further – How Much Data to Train an AI

The question “how much data is needed to train an AI model?” is central to Deep Learning. One of my students asked it in my course Master Deep Learning:

“How much exactly is big data? At what point could I start to use Deep Learning, and at what point is there not enough data? I am not Netflix, but I want to open a door for my cat – should I pursue Deep Learning or not?

As your student, I am looking forward to the next modules.”


In this online training course, I offer, among other things, a module in which I answer all my students’ questions. I’m 100% available to discuss neural networks with them.

This article is an answer to one of my students’ questions, which I’ve decided to make available for free.

If you want to know more, you can access my Action plan to Master Neural networks.

A program of 7 free courses that I’ve prepared to guide you on your journey to master Deep Learning.

Inside I’ll also be presenting my course Master Deep Learning.

Of course, it doesn’t commit you to anything. You’ll be able to simply enjoy quality information that I’ve put together for you – free of charge.

If you’re interested, click here:






Tom Keldenich

Artificial Intelligence engineer and data enthusiast!

Founder of the website Inside Machine Learning
