The Internet of Things (IoT) is ever growing. This interconnected network of devices, vehicles, appliances and other physical objects, embedded with sensors, software and network connectivity, has entered consumers’ lives through smart products and appliances.
The smart functionality of these devices is predicated on the processing of data, including personal data gathered from users. This has led to concerns about the risks to consumers. Equally, the manufacturers and developers of these devices face significant risks if they fail to comply with laws governing privacy and the use of personal data.
This article looks at the key principles which smart product manufacturers, developers and service providers must follow when processing personal data.
ICO guidance
On 16 June 2025, the Information Commissioner’s Office (ICO) published guidance for organisations that process personal information gathered from consumer IoT products and associated services (such as an app). The guidance aims to provide regulatory certainty.
Lawful basis for processing
There must be a legitimate reason for collecting and processing personal data. Lawful bases for this are consent; performance of a contract; legal obligation; vital interests; public task; and legitimate interests.
Many IoT devices process personal data in order to interact with the user, with persons connected to the user via social media, and with other services. The correct lawful basis will depend on the purpose for which the data is collected and the relationship between the controller and the data subject.
Special category data
As so many IoT products are designed for personal health or wellbeing purposes, much of the data gathered by IoT devices will be ‘special category’ data, for example data relating to an individual’s health, sexual orientation or religious beliefs. Special category data is subject to heightened legal protection; therefore organisations which collect such data through smart devices must be able to show that:
- they have a lawful basis to collect the special category data under Article 9 GDPR – which for consumer products will typically be the user’s freely given and informed consent.
- their use of special category data is necessary and proportionate, and is the minimum amount required to fulfil the particular and defined purpose.
Other data that is not special category, such as location data, may nonetheless cause harm to the individual if disclosed unlawfully. Organisations should be alert to such risks when assessing whether personal data can be lawfully collected and how those risks can be mitigated.
Consent
If consent is relied upon as the lawful basis for processing personal data, the user’s consent must be express, informed and freely given for the exact purpose for which the data is collected. Automatic ‘opt-in’ choices, or consent demanded simply for the device to function, are invalid.
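To illustrate what purpose-specific, opt-in consent might look like in practice, the following is a minimal sketch using a hypothetical data model (the class, field and purpose names are assumptions, not part of the ICO guidance). Nothing is pre-ticked, consent is recorded per purpose against the privacy notice the user actually saw, and optional purposes can be declined or withdrawn without losing the device’s core function.

```python
# Minimal, hypothetical sketch of recording explicit, purpose-specific consent.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                       # e.g. "marketing_emails" (illustrative)
    granted: bool = False              # defaults to opt-out, never pre-ticked
    timestamp: datetime | None = None
    notice_version: str | None = None  # which privacy notice the user was shown

    def grant(self, notice_version: str) -> None:
        """Record an affirmative, informed opt-in for this single purpose."""
        self.granted = True
        self.timestamp = datetime.now(timezone.utc)
        self.notice_version = notice_version

    def withdraw(self) -> None:
        """Withdrawing consent should be as easy as giving it."""
        self.granted = False
        self.timestamp = datetime.now(timezone.utc)

# Example: marketing consent is separate from anything the device needs to
# function, so declining it does not disable the product.
marketing = ConsentRecord(user_id="user-123", purpose="marketing_emails")
marketing.grant(notice_version="2025-06")
```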
Accountability
Being accountable means being able to demonstrate compliance with data protection law. For an organisation, this means keeping records of governance and decision making, identifying and assessing risks related to data processing, and showing that the technical and organisational measures necessary to ensure processing complies with the law are in place.
Fairness and transparency
Users should be notified of what data will be collected, the purposes for which it is collected, how it will be used, and by whom, before they provide the data or give consent to process it. Privacy notices must be kept up to date and provided to users whenever there are any changes to how their data may be processed.
The fairness principle means that personal data should not be used in a way that individuals do not expect; for example, data collected by a smart watch to track a user’s fitness and sleep should not be used for marketing or to train AI. Manufacturers and developers of smart products should keep their data processing under regular review, to ensure data is not being used in a way that could be harmful to individual users. Use of AI to process personal data is cited as a particular risk, due to the potential for ‘hallucination’ (described as occurring “when an AI database generates fake sources of information” (See Wadsworth v. Walmart Inc., 348 F.R.D. 489, 493)) and/or bias, leading to possibly harmful outcomes.
Special consideration must be given to the risks to children. Devices such as smart speakers often sit in rooms accessible to children and are often used unsupervised. Children under 13 cannot consent to the processing of their data, so manufacturers should give careful consideration to the risk of their devices being used by this group. Where products are marketed to children or may be used by them, requirements for parental consent should be put in place for features that involve data processing.
Accuracy and security
Organisations have a duty to ensure that the personal data they hold is accurate, errors are corrected, and data is regularly updated. Where sensors are used, they must be capable of accurately discerning different types of information.
Personal data must be processed securely to prevent it from being destroyed or disclosed unlawfully, and to avoid harm to the rights and freedoms of the individual data subjects. This includes an obligation not to share personal data with any person or organisation who cannot provide adequate levels of protection and privacy. The security required will be different for data processed within the product, compared to data processed on external servers or the cloud. The ICO guidance outlines various steps organisations should take, which include:
- requiring unique and complex passwords and, where possible, using multi-factor authentication;
- monitoring IoT products for security threats and vulnerabilities and carrying out regular security updates;
- using encryption to protect data (a minimal sketch follows this list); and
- publishing a policy for users to report issues.
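As a concrete illustration of the encryption step, the sketch below encrypts a sensor reading before it leaves the device. It assumes the third-party `cryptography` package and an illustrative reading; key provisioning, storage and rotation are out of scope here and in practice would need something like a hardware-backed keystore.

```python
# Minimal sketch: symmetric encryption of telemetry using the `cryptography`
# package's Fernet recipe (assumed to be available on the device).
import json
from cryptography.fernet import Fernet

def encrypt_reading(key: bytes, reading: dict) -> bytes:
    """Serialise a sensor reading and encrypt it with a symmetric key."""
    payload = json.dumps(reading).encode("utf-8")
    return Fernet(key).encrypt(payload)

key = Fernet.generate_key()             # in practice, provisioned per device
token = encrypt_reading(key, {"heart_rate": 72, "recorded_at": "2025-06-16T10:00:00Z"})
plaintext = Fernet(key).decrypt(token)  # only holders of the key can read the data
```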
Privacy enhancing technologies - for example synthetic data, or sophisticated cryptographic methods - may be considered as part of the data security armoury, but the ICO warns that these technologies do not replace organisations’ fundamental obligations under the data protection principles or their duty to maintain the security of the data entrusted to them.
Retention of personal data
Data should not be kept for longer than necessary. There are no set time limits for storing personal data, and therefore organisations must themselves determine what is appropriate. Once determined, retention periods should be followed and data securely erased or anonymised when it is no longer required.
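The following is a minimal sketch of how a defined retention period might be enforced automatically. The schema and the 365-day figure are illustrative assumptions only; the appropriate period must be determined and documented by the organisation itself, and deletion in a real system would need to be a secure erasure.

```python
# Minimal, hypothetical sketch of enforcing a documented retention period.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative; set per the organisation's policy

def apply_retention(records: list[dict], now: datetime | None = None) -> list[dict]:
    """Keep only records still within the retention period; drop the rest."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for record in records:
        if now - record["collected_at"] > RETENTION:
            continue  # expired: securely erase or anonymise in a real system
        kept.append(record)
    return kept
```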
Individual rights
IoT manufacturers and developers must enable their individual users to exercise their data subject rights including the right to:
- Access and take a copy of their data.
- Have their data corrected or erased.
- Receive a machine-readable copy of their data and transfer it to another provider.
The rights available depend on the lawful basis for processing the data. Organisations must inform users how they can exercise their data subject rights, and put in place the infrastructure to enable this. This could be through a dedicated data privacy team or by having settings in the device which allow the user to make a request.
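By way of illustration, a request made through a device setting could be fulfilled by an export routine along these lines. This is a sketch only: `find_records_for_user` is an assumed helper standing in for whatever data store the product actually uses.

```python
# Minimal sketch: fulfilling an access/portability request by exporting
# everything held about a user in a machine-readable (JSON) format.
import json

def export_user_data(user_id: str, find_records_for_user) -> str:
    """Collect all records held for a user and return them as a JSON document."""
    records = find_records_for_user(user_id)  # assumed data-store lookup
    return json.dumps({"user_id": user_id, "records": records}, indent=2, default=str)
```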
Particular care must be taken when IoT devices are used for automated decision making, including profiling. Automated decision making must only take place if it is necessary to perform a contract, authorised by law, or if the data subject provides explicit consent.
Differences in UK and EU data protection laws
Although the ICO’s IoT guidance applies in the UK, the similarities between UK and EU data protection laws mean that this guidance can be a useful document for businesses in both the UK and EU markets. However, manufacturers should be conscious of the differences in the regimes.