Resources

Podcasts, Guides, Updates & More

Join our extensive list of clients who have their data privacy under control


Data Protection People Blogs

Data Privacy Learning & Guidance

Our mission is to make data protection easy: easy to understand and easy to do. Our weekly podcasts are available in our Resource Centre along with a collection of articles, white papers, useful guidance, templates, case law, and opinions – providing you with tools you can utilise in your workplace.

When Can You Refuse a Subject Access Request (SAR)?

Understanding When You Can Refuse a Subject Access Request (SAR)

Subject Access Requests (SARs) are a powerful tool for individuals, granting them the right to access the personal data organisations hold about them. Yet compliance with SARs can be challenging for organisations, particularly when requests are burdensome or potentially abusive. While it’s essential to respect individual privacy rights, there are legitimate situations where an organisation can lawfully refuse a SAR. This guide addresses the question of when you can refuse a subject access request (SAR), the circumstances under which an organisation can deny access, and the considerations needed to ensure compliance with the UK GDPR and the Data Protection Act 2018.

“Manifestly Unfounded or Excessive” Requests

The UK GDPR allows organisations to refuse SARs that are “manifestly unfounded or excessive”. But what does this mean in practice? Let’s break down these terms to understand when a SAR may qualify for refusal under these conditions.

What Makes a Request “Manifestly Unfounded”?

A SAR may be considered “manifestly unfounded” if it’s clear that the requester has no genuine desire to exercise their rights. This could involve situations where the SAR is filed purely to disrupt operations, harass the organisation, or attempt to exert leverage. For example:

  • Example: An ex-employee submits repeated SARs to their former employer with no legitimate need for the data. They do this solely to disrupt the company’s operations. If they demand compensation or a “settlement” in exchange for retracting their SAR, the request may be considered manifestly unfounded.
  • Key Consideration: Simply exhibiting anger or hostility does not automatically make a SAR unfounded. It’s up to the organisation to demonstrate that the individual’s intentions are not genuine.

When Is a Request Considered “Manifestly Excessive”?

A request is “manifestly excessive” if it’s clearly unreasonable in scope or frequency, especially when compared to the benefit or purpose of the request. Factors to weigh include:

  • Repetition: Has the requester already submitted recent SARs that cover the same data?
  • Effort Required: Would the amount of work required to process the SAR far exceed its utility?
  • Example: An individual submits a SAR every two weeks, even though the data held has not changed. This would likely qualify as manifestly excessive.

Potential Changes in SAR Refusal Standards: “Vexatious or Excessive” Requests

As part of ongoing data protection reform, the UK government is considering updating the criteria for refusing SARs from “manifestly unfounded or excessive” to “vexatious or excessive.” This new standard would broaden the grounds for refusal and potentially offer more protection to organisations:

  • Vexatious Requests: Under the proposed standard, organisations could decline SARs that are designed to harass or cause distress, particularly if they appear to abuse the SAR process.
  • Resource Allocation: Organisations would have more flexibility to refuse SARs based on available resources.

Although the government has not finalised these changes, understanding the proposed standard now allows you to prepare should it take effect.

Legal Exemptions for Refusing a Subject Access Request (SAR)

If a SAR does not meet the “manifestly unfounded or excessive” criteria, there may still be a basis for lawful refusal via exemptions. The UK Data Protection Act 2018 provides a variety of specific exemptions in Schedule 2, which can be used to withhold certain data. It’s essential to apply each exemption thoughtfully to ensure compliance and transparency.

Most Common Exemptions to Consider

  • Prejudice to Law Enforcement or Regulatory Purposes: If fulfilling a SAR could interfere with a police investigation or regulatory enforcement, data may be withheld. For example, an organisation might need to withhold details of an investigation to avoid tipping off the data subject.
  • Confidential Information: Some data is considered confidential due to its context, such as personal references or sensitive communications. For example, confidential references given to an organisation can be exempt from SARs to maintain privacy.

When using exemptions, bear in mind that each data item in the SAR must be evaluated separately. If the majority of the data can be shared without issue, it should be, with any exempt information clearly redacted.

Protecting Third-Party Privacy

The UK GDPR and the DPA 2018 mandate that third-party data is protected when responding to SARs. This requirement is based on the principle that individuals have a right to their own data but not to the data of others unless they have appropriate consent. Here’s how to handle third-party data in a SAR:

  • Identify Third-Party Information: Determine if any documents or communications contain data about individuals other than the data subject.
  • Redact Carefully: Redact all third-party data that isn’t directly relevant or legally permissible to disclose.
  • Balancing Confidentiality and Disclosure: If the data subject already possesses information about the third party or their involvement, this context should be considered before making redactions.

Legal Privilege: When Confidentiality Takes Priority

Legal privilege is a powerful exemption that protects communications made in the context of seeking legal advice or related to ongoing or anticipated legal proceedings. SARs cannot compel the disclosure of privileged information, including:

  • Legal Advice Privilege: Communications between a client and legal advisor for legal guidance.
  • Litigation Privilege: Communications created in anticipation of litigation, which could include records of conversations, emails, and even notes relevant to a potential legal case.

Management Information: Internal Planning and Forecasting

Management information related to strategic planning, such as restructuring efforts or redundancy plans, can often be withheld under the DPA 2018 if disclosing it would compromise organisational goals.

  • Example: If an employee submits a SAR while the organisation is planning redundancies, you may lawfully withhold information related to their potential redundancy as long as revealing this data would jeopardise organisational planning.

Confidential References and Other Employment-Related Information

SARs commonly target information in HR records. Confidential references, however, are typically exempt from disclosure to protect the integrity of employment references.

  • Example: If a former employee submits a SAR requesting a copy of a confidential reference given to a new employer, the organisation can withhold this information.

Exam Scripts and Educational Records

Another specific exemption relates to exam scripts and academic assessments. Students are entitled to their exam results and examiner comments. However, they are not automatically entitled to copies of the exam scripts themselves.

  • Timeframe Considerations: If a student submits a SAR before the results are announced, the organisation has additional time to respond: up to five months from the request or 40 days after the results are released, whichever is earlier (see the sketch below).
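
As a rough illustration of how that extended deadline works in practice, the sketch below (a hypothetical Python helper, with calendar months approximated) computes the response date as the earlier of the two limits:

    from datetime import date, timedelta

    def exam_script_sar_deadline(request_received: date, results_announced: date) -> date:
        """Illustrative only: extended deadline for a SAR received before exam
        results are announced. Assumes the earlier of the two limits applies."""
        # Five months from receipt of the request (months approximated; day capped
        # at 28 to avoid invalid dates such as 31 February).
        month = request_received.month + 5
        year = request_received.year + (month - 1) // 12
        month = (month - 1) % 12 + 1
        five_months_from_request = date(year, month, min(request_received.day, 28))
        # Forty days from the announcement of the results.
        forty_days_from_results = results_announced + timedelta(days=40)
        return min(five_months_from_request, forty_days_from_results)

    # Example: SAR received 1 May 2024, results announced 20 August 2024.
    print(exam_script_sar_deadline(date(2024, 5, 1), date(2024, 8, 20)))  # 2024-09-29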

Best Practices for Handling Subject Access Request Refusals

Responding to SARs, particularly when using exemptions, requires a careful and transparent approach. Here are best practices to ensure compliance and minimise risk:

  1. Document Decision-Making: Keep detailed records of your reasons for refusing or partially withholding data (a simple illustrative record format follows this list). This documentation will be essential in case of regulatory scrutiny.
  2. Apply Exemptions Selectively: Each piece of data within the SAR should be reviewed to determine if it qualifies for disclosure or exemption.
  3. Provide Clear Explanations: If you withhold information, provide a clear but general explanation for each redaction without compromising confidentiality or legal privilege.
  4. Consider Your Resources: If facing an exceptionally large or complex SAR, you may reach out to a data protection consultant to ensure compliance and efficient handling.
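
To illustrate point 1, here is a minimal, purely hypothetical sketch of how a refusal decision might be logged in a structured, auditable form; the field names are illustrative, not a prescribed format:

    from dataclasses import dataclass, field, asdict
    from datetime import date
    import json

    @dataclass
    class SarRefusalRecord:
        """Hypothetical structure for documenting a SAR refusal decision."""
        request_id: str
        received: date
        ground: str                      # e.g. "manifestly excessive" or a named exemption
        reasoning: str                   # why the ground applies to this specific request
        data_reviewed: list[str] = field(default_factory=list)
        partially_disclosed: bool = False
        reviewed_by: str = ""
        decision_date: date = field(default_factory=date.today)

    record = SarRefusalRecord(
        request_id="SAR-2024-017",
        received=date(2024, 6, 3),
        ground="Manifestly excessive (repeat request, no new data held)",
        reasoning="Identical SAR fulfilled on 2024-05-20; no further processing since.",
        data_reviewed=["HR file", "email archive"],
        reviewed_by="Data Protection Officer",
    )

    # Keep the record somewhere auditable in case of regulatory scrutiny.
    print(json.dumps(asdict(record), default=str, indent=2))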

Lawfully refusing a SAR can be complex, requiring a solid understanding of the UK GDPR, the Data Protection Act 2018, and your organisation’s data handling policies. By assessing each request carefully, you can meet your compliance obligations while protecting organisational resources.

Need expert help with SARs? We are here to guide you through SAR responses and through deciding when to refuse a subject access request. Reach out to simplify your data protection strategy today.

Using AI and Facial Recognition to Determine Age

Using AI and Facial Recognition to Determine Age: Key Implications and Challenges

Artificial intelligence (AI) and facial recognition technologies continue to evolve rapidly and are being used for a surprising range of applications, including determining a person’s age. From verifying age for restricted purchases to protecting minors online, age-detection technology has both compelling benefits and serious concerns. In this article, we’ll explore the potential of using AI and facial recognition for age estimation, including the key ethical, legal, and privacy implications that organisations and consumers need to consider.

How AI and Facial Recognition Are Used to Determine Age

AI-powered facial recognition systems analyse images of faces to estimate age by examining features such as skin texture and other markers that typically change over time. These systems rely on large datasets and machine learning models trained to recognise and interpret age-related patterns. Many sectors, including retail, gaming, online platforms, and social media, are testing or deploying this technology to verify age and restrict access to age-sensitive content and services.
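
To make the mechanics concrete, here is a minimal, hypothetical sketch that frames age estimation as supervised regression over numeric face-derived features (in practice these would be embeddings from a face-analysis model); it uses scikit-learn and synthetic data purely for illustration and does not represent any particular commercial system:

    # Illustrative only: age estimation framed as regression over face-derived features.
    # Real systems use richer features (e.g. deep face embeddings) and far larger,
    # carefully governed training datasets.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_features = 2000, 64

    # Synthetic stand-in for features such as skin-texture measures.
    X = rng.normal(size=(n_samples, n_features))
    ages = rng.integers(10, 80, size=n_samples)
    X[:, 0] = ages / 80.0 + rng.normal(scale=0.1, size=n_samples)  # one age-correlated feature

    X_train, X_test, y_train, y_test = train_test_split(X, ages, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

    print(f"Mean absolute error: {mean_absolute_error(y_test, model.predict(X_test)):.1f} years")
    # Before deployment, error should also be compared across demographic subgroups
    # to check for the bias issues discussed below.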

Applications of Age Detection with AI and Facial Recognition

AI-driven age verification offers a range of applications that could reshape how age-restricted services are delivered:

  • Age Verification in Retail: Many stores are trialling AI facial recognition to confirm age for restricted purchases like alcohol or tobacco. This approach is faster and potentially more reliable than traditional ID checks.
  • Enhanced Safety for Minors Online: Social media platforms and gaming sites can use age detection to prevent minors from accessing inappropriate content. This ensures compliance with regulations like COPPA in the U.S. and the UK’s Age Appropriate Design Code.
  • Content Personalisation and Marketing: Marketers may use age detection to tailor ads to specific age groups. This ensures that campaigns are more relevant and aligned with user demographics.

While the technology offers convenience and enhanced safety, it also comes with significant concerns around privacy, consent, and accuracy.

Ethical and Privacy Implications

Using facial recognition for age estimation raises several ethical and privacy concerns that organisations must address:

  • Privacy Concerns: Facial recognition captures and analyses biometric data, which laws like GDPR consider sensitive. Users may not always know that their facial data is being collected and analysed, raising concerns about consent and transparency.
  • Data Security and Misuse Risks: Biometric data, once collected, is a prime target for cybercriminals. If improperly stored or secured, these data can be vulnerable to breaches, posing serious risks for individuals and organisations alike.
  • Potential for Bias and Inaccuracy: AI models depend on the data they are trained on. Age estimation algorithms vary in accuracy across different demographic groups, which can lead to unfair outcomes if certain age groups or ethnicities are systematically misrepresented.

Legal and Compliance Challenges

The use of facial recognition to determine age is governed by data protection laws that vary by region:

  • GDPR and Biometric Data: In the EU, facial data is classified as biometric data under GDPR, meaning it requires explicit consent to collect and process. Any organisation using AI-based age detection in the EU must ensure compliance with GDPR’s strict consent and data protection standards.
  • U.S. and State-Level Regulations: While the U.S. lacks a federal law specifically governing biometric data, several states, including Illinois, Texas, and Washington, have biometric privacy laws. In California, the CCPA provides privacy protections that apply to AI-driven age verification, especially when organisations collect personal data.
  • UK’s Age Appropriate Design Code: In the UK, the Age Appropriate Design Code (Children’s Code) mandates that online services take steps to protect minors, which may encourage more platforms to adopt age verification but also requires transparency and strict data protection measures.

Balancing Benefits with Responsible Use

For organisations interested in using AI and facial recognition for age determination, careful consideration is essential. Here are a few best practices for responsible use:

  1. Prioritise Transparency and Consent: Always inform users when you collect their data for age verification, and obtain explicit consent, especially for biometric data.
  2. Ensure Accuracy and Fairness: Regularly test AI models to provide consistent, fair, and accurate results across all demographic groups.
  3. Invest in Data Security: Robust data security practices are essential to protect biometric information and prevent unauthorised access.
  4. Consult Legal and Compliance Experts: Compliance with GDPR, CCPA, and other relevant regulations is complex, especially where facial data is concerned. Legal guidance is crucial to ensure compliant data practices.

AI and facial recognition technology offer exciting possibilities for age verification. However, organisations must weigh the benefits against significant ethical, privacy, and compliance challenges. By implementing these practices, businesses can responsibly leverage age estimation technology while respecting users’ rights and building trust.

 

Understanding the complex relationships in data protection

Data Protection Relationships – Understanding the complex relationships in data protection

In the complex landscape of data protection, knowing whether you are a data controller, joint controller, or processor is essential. This clarity ensures that you comply effectively with UK data protection laws.

  • Controllers decide how and why personal data is processed.
  • Processors handle data on a controller’s behalf, with some responsibilities but fewer compliance obligations than controllers.
  • Both controllers and processors must meet specific legal standards, and the Information Commissioner’s Office (ICO) is empowered to act against both parties for breaches.

For organisations, particularly in fields like political campaigns, understanding these roles is critical, as responsibilities can vary significantly depending on the structure and purpose of data usage.

Understanding Key Roles in Data Protection   

1. Controllers, Joint Controllers, and Processors Defined

  • Controllers are decision-makers, determining the purposes and methods of processing personal data. If multiple controllers share control of the same data for the same purposes, they are considered joint controllers. However, they are not joint controllers if they use the same data for different purposes.
  • Processors act solely on the controller’s instructions, handling data without determining its purpose. They are responsible for aspects like data security and breach notification but do not have compliance obligations as comprehensive as those of controllers.

Both roles are essential under UK GDPR, and the ICO can enforce penalties against both controllers and processors for non-compliance.

2. Data Controllership in Political Campaigns

In political campaigns, controllership roles can vary significantly due to the diverse structures and legal arrangements among political parties, campaign groups, and elected representatives. For example:

  • Political parties might have separate data controllership roles at the national and local levels.
  • Individual candidates or elected representatives may act as controllers independently for activities like constituency work.

Political candidates, campaign groups, and political parties can access the electoral register, each serving as a controller for this data. Any data sharing between elected representatives and party offices requires careful consideration to meet UK GDPR standards, ensuring compliance in all data handling activities.

3. Real-World Controllership Examples in Campaigning

  • Example 1: An independent candidate in a local election compiles a list of supporters and contracts a company to send campaign letters. Here, the candidate is the controller, deciding the purpose and method (sending letters to encourage voting), while the company acts as a processor, executing the task per the candidate’s instructions.
  • Example 2: A political party engages a research company to conduct voter modelling. The party specifies the desired outcome but leaves the methodology to the research firm, which means both the party and the firm become joint controllers, jointly determining aspects of data processing while retaining distinct responsibilities.

Identifying and Defining Data Controllership in Practice

For effective GDPR compliance, organisations should clearly identify who controls each data set and under what circumstances. Mapping data flows and documenting which organisations are responsible for which data can be useful steps. It’s essential to clarify whether data is processed for a shared purpose or not, as this distinction affects the type of relationship (controller vs. joint controller) and compliance obligations.
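
As a lightweight illustration (not a prescribed format, and with hypothetical field names), a controllership map can be kept as simple structured records per data set:

    from dataclasses import dataclass

    @dataclass
    class DataFlowRecord:
        """Hypothetical entry in a controllership map."""
        dataset: str
        purpose: str
        role: str                # "controller", "joint controller" or "processor"
        other_parties: list[str]
        shared_purpose: bool     # same data and same purpose suggests joint controllership
        agreement: str           # e.g. "processor contract", "joint controller arrangement"

    flows = [
        DataFlowRecord("Supporter mailing list", "Campaign letters", "controller",
                       ["Mailing house (processor)"], shared_purpose=False,
                       agreement="Processor contract"),
        DataFlowRecord("Voter modelling data", "Voter modelling", "joint controller",
                       ["Research company"], shared_purpose=True,
                       agreement="Joint controller arrangement"),
    ]

    for f in flows:
        print(f"{f.dataset}: {f.role}; agreement in place: {f.agreement}")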

For more information on mapping and documenting controllership relationships, consider completing a Data Protection Impact Assessment (DPIA).

Establishing and Managing Data Protection Relationships

After determining if you are a controller, joint controller, or processor, it’s crucial to formalise each relationship according to UK GDPR standards:

  • Controller and Processor: This relationship requires a written contract that outlines the processor’s duties and binds the processor to act solely on the controller’s instructions.
  • Joint Controllers: These arrangements require a transparent agreement outlining each party’s responsibilities under GDPR, even though no formal contract is mandated.

For organisations in politically sensitive areas, it’s also important to remember that the public may not fully understand the nuances of these roles. Ensuring clear and accessible ways for individuals to exercise their data rights can prevent misunderstandings and enhance trust.

The Data Protection Fee

Under the Data Protection (Charges and Information) Regulations 2018, most controllers processing personal data are required to pay a data protection fee to the ICO. There are exceptions for elected representatives, prospective representatives, and House of Lords members. Most political parties and campaign groups, however, are required to pay this fee. Visit the ICO website for full details on payment and exemptions.

By understanding the complex relationships in data protection and documenting your role, you can ensure compliance with data protection regulations, particularly under complex arrangements like political campaigns. Following these best practices enhances accountability, transparency, and compliance with data protection laws, helping you manage data responsibly and maintain trust with the public.

How Data Protection People Can Help

At Data Protection People, we make data protection simple. Our expert team clarifies your role and responsibilities, ensuring GDPR compliance even in complex setups like political campaigns. With tailored training, audits, and assessments, we help keep your organisation fully compliant. Get in touch today to make data protection easy.

How to Demonstrate Accountability for GDPR Compliance?

Accountability is one of the most significant principles of the UK GDPR. It shows your clients, key stakeholders and employees that you’re committed to protecting personal data and take data protection seriously.

But how do you show accountability in practice? Previously, we covered some example approaches; now, we’ll explore each method in detail.

What Is the Accountability Principle?

Under Article 5(2), the UK GDPR states that organisations are responsible for complying with the following six data protection principles:

  1. Lawfulness, fairness and transparency
  2. Purpose limitation
  3. Data minimisation
  4. Accuracy
  5. Storage limitation 
  6. Integrity and confidentiality

This responsibility falls within the accountability principle, which goes beyond knowing your obligations to being able to prove your compliance.

You need to demonstrate the measures you’ve taken to protect data subjects’ rights and freedoms so they can feel assured their personal data is in safe hands. Meeting this principle will improve overall compliance with the UK GDPR and demonstrate your organisation respects people’s privacy. 

How Do You Demonstrate Accountability?

So, how do you comply with the accountability principle? While there is no single prescribed way, below we outline some of the best measures you can take:

Implement Data Protection Policies

Under the UK GDPR, you must implement data protection policies where appropriate. It’s just one part of the GDPR documentation required to help comply with your legal obligations.  

Which policies you have will vary from business to business, but there are some that should be mandatory across all:

  1. Data Protection Policy
  2. Privacy Notice
  3. Employee Privacy Notice
  4. Data Retention Policy
  5. Records of Processing Activities

For more guidance, see our latest guide on writing a data protection policy. We outline what needs to be covered and why policies are essential for your business. 

Once the policies are approved, the real work begins. Your data protection officer (DPO) or senior management must ensure your employees know these policies and what is required of them in their day-to-day roles when handling personal data. 

Organise Processor Contracts

If you engage organisations that handle personal data on your behalf, such as a CRM provider, you should enter into a binding written contract called a Data Processing Agreement (DPA). This agreement outlines the roles and responsibilities of both parties for the processing, holding both accountable for meeting their obligations.

The contract or legal agreement should specify that the processor must follow your documented instructions unless the law requires otherwise and ensure their staff keep the data confidential. It should also state that the processor must help the controller manage individuals’ rights requests and agree to audits and inspections.

The ICO provides more detail on what to include in your contract. A contract will help key parties understand their obligations and demonstrate good accountability. 

Carry Out & Maintain RoPAs and DPIAs

Assessing and recording your processing activities shows you’re doing everything in line with your accountability obligations. 

Before completing a record of processing activities (RoPA), we recommend a data mapping exercise. A data map sets out what personal data you process, where it comes from, where it goes, and how you store it. This is the foundation of your RoPA, which documents how and why you’re processing data. Listen to part one and part two of our RoPA roundup for more insight.
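
Purely as an illustration (the column names are hypothetical, not an Article 30 template), a data map can be kept as simple structured rows that later form the backbone of your RoPA:

    import csv
    import io

    # Hypothetical data-map rows: what is processed, where it comes from and goes,
    # how it is stored, and why.
    data_map = [
        {"data": "Customer contact details", "source": "Web sign-up form",
         "destination": "CRM (processor)", "storage": "Cloud, UK region",
         "purpose": "Order fulfilment", "lawful_basis": "Contract", "retention": "6 years"},
        {"data": "CCTV footage", "source": "Office cameras",
         "destination": "On-site recorder", "storage": "Encrypted disk",
         "purpose": "Premises security", "lawful_basis": "Legitimate interests",
         "retention": "30 days"},
    ]

    # Export the map as CSV so it stays easy for employees to review and keep accurate.
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=data_map[0].keys())
    writer.writeheader()
    writer.writerows(data_map)
    print(buffer.getvalue())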

Where processing is considered high risk to data subjects’ rights and freedoms, you should assess potential data protection risks alongside recording your activities. A data protection impact assessment (DPIA) helps identify and minimise these risks from the start of new projects, allowing you to address them before they arise.

This GDPR documentation helps maintain transparency and accountability across your organisation. To maintain this, make it easily accessible so employees can keep it up-to-date and accurate. 

Employ a Data Protection Officer 

Organisations should identify a responsible person to assist in the organisation’s efforts towards data protection compliance. Where applicable, organisations will appoint a data protection officer (DPO), who has specific tasks to carry out under Article 39 of the UK GDPR. In certain circumstances, appointing a DPO is a legal requirement; discover whether you need to appoint a DPO in our blog.

You can hire your DPO in-house or outsource the role, the latter being ideal for eliminating conflicts of interest. At Data Protection People, our outsourced DPOs work independently of internal affairs and act solely in accordance with the law. Should you not require a DPO, you should still ensure someone is responsible for managing your obligations.

Schedule Data Protection Training

Now that you have organised all the policies, procedures and measures, you need to implement them. Your employees should receive appropriate data protection training to ensure they are aware of their responsibilities. 

Essential training may include handling subject access requests (SARs), managing personal data breaches or becoming a Data Champion. This training should be regularly refreshed to demonstrate your commitment to data protection. 

Put Security Measures Into Place

Information security protects personal data from falling into the wrong hands. To demonstrate accountability, you must have the following:

  • Policies and procedures for creating, locating and retrieving records
  • Security measures in place for data transfers
  • Procedures for maintaining data quality 
  • A data retention schedule based on your business needs (one way to operationalise this is sketched after this list)
  • Methods for destroying personal data
  • An information asset register which holds details of all company software and hardware
  • Acceptable Use procedures of software (systems or applications) 
  • An access control policy 
  • Measures for preventing unauthorised access 
  • BYOD and remote working policy
  • Business Continuity plan
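
As one hypothetical way to operationalise a retention schedule (the categories and periods below are illustrative only), records can be checked against the schedule so that anything past its retention period is flagged for destruction:

    from datetime import date, timedelta

    # Hypothetical retention schedule: record category -> retention period in days.
    RETENTION_SCHEDULE = {
        "job_applications": 180,
        "invoices": 6 * 365,
        "cctv_footage": 30,
    }

    def records_due_for_destruction(records: list[dict], today: date) -> list[dict]:
        """Flag records whose retention period under the schedule has expired."""
        due = []
        for record in records:
            limit = RETENTION_SCHEDULE.get(record["category"])
            if limit is not None and record["created"] + timedelta(days=limit) < today:
                due.append(record)
        return due

    records = [
        {"id": 1, "category": "cctv_footage", "created": date(2024, 1, 2)},
        {"id": 2, "category": "invoices", "created": date(2020, 3, 15)},
    ]
    for r in records_due_for_destruction(records, today=date(2024, 6, 1)):
        print(f"Record {r['id']} ({r['category']}) has exceeded its retention period.")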

Stay Compliant with an Outsourced DPO 

Complying with the accountability principle is not a one-off task. You need to nurture a strong data protection culture in which your employees prioritise data privacy.

Our outsourced DPOs will continuously monitor your compliance, providing expert advice, managing risks and implementing best practices tailored to your business needs. Contact our team to learn more.

Data Protection People Podcasts

Data Privacy Learning & Guidance

GDPR Radio – Episode 189

GDPR Radio – Episode 189 – Data Protection News of the Week

Welcome to Episode 189 of GDPR Radio! Last week’s live session was another fantastic discussion with over 100 community members tuning in to keep up to date with the latest news in the world of data protection. GDPR Radio is our bi-weekly series designed to provide fresh insights and cover the latest in GDPR developments, data protection trends, and regulatory updates.

What You Missed in Episode 189

Our hosts took a deep dive into the pressing issues affecting data protection this month. Each episode of GDPR Radio is carefully structured to highlight recent changes in legislation, cover high-profile data breaches, and provide updates on data privacy initiatives both in the UK and across the EU. For Episode 189, we focused on a mix of timely topics, covering both the latest legal changes and practical tips to help organisations stay compliant in today’s fast-evolving privacy landscape.

Joining Our GDPR Radio Community

The GDPR Radio series isn’t just a podcast—it’s an interactive platform where data protection professionals, enthusiasts, and anyone curious about data privacy can connect, ask questions, and share insights. If you’re keen to join our next live session, becoming part of our community is easy. By joining, you’ll gain access to exclusive content, networking opportunities, and direct communication with industry experts. Sessions are held every other week, giving our listeners an ongoing touchpoint with all things GDPR.

Why Tune In?

Each GDPR Radio episode is packed with practical information to help organisations navigate the challenges of data compliance and stay up to date on the most important data protection news. With over 1,300 subscribers and a thriving community, it’s a trusted source for reliable, timely information on the issues shaping data privacy today. Missed an episode? No worries! You can catch up on all past episodes, including Episode 189, on Spotify, Audible, and other major streaming platforms.

Stay connected, stay compliant, and join us on GDPR Radio for expert analysis, in-depth discussions, and a look ahead at what’s next in the world of data protection.

Using Artificial Intelligence In The Workplace

Using Artificial Intelligence In The Workplace: Data Protection Made Easy Podcast, Episode 192

On Friday, we hosted an engaging episode of the Data Protection Made Easy podcast on Using Artificial Intelligence In The Workplace. Joined by a live audience of over 150 data protection professionals, our hosts Jasmine Harrison, Joe Kirk, and Philip Brining brought their data protection expertise to a lively discussion on the potential risks and challenges that AI presents in today’s workplace.

While none of our hosts claim to be AI experts, they explored the practical implications of integrating AI responsibly, covering key legislation, new tools, and global standards. With a balanced perspective, they offered insights without scare tactics, engaging with an audience eager to better understand AI’s evolving role.

Highlights from Episode 192

News of the Week

Before diving into AI, our hosts kicked things off with their popular “news of the week” segment, where they discussed recent developments in data protection, cybersecurity, and compliance. From regulatory updates to emerging best practices, this segment set the stage for a thought-provoking main discussion.

Exploring AI’s Role in the Workplace

After covering current news, Jasmine, Joe, and Philip transitioned into the AI discussion, addressing some of the latest and most critical elements impacting AI’s role in data protection. Here’s a closer look at the three main topics they covered:

  1. The EU AI Act
    The EU AI Act is a first-of-its-kind attempt at regulating AI across the European Union, aiming to ensure that AI development aligns with human rights and data protection principles. Our hosts unpacked how this legislation, although still in its draft stages, will place restrictions on high-risk AI applications, particularly those that involve personal data. They discussed the potential compliance challenges businesses might face, such as navigating AI bias and ensuring AI decisions remain fair and transparent. Although new and complex, the EU AI Act is shaping up to become a major milestone in global AI governance.
  2. Microsoft Co-Pilot and Google Notebook
    AI tools like Microsoft Co-Pilot and Google Notebook have found their way into daily operations, streamlining processes from drafting content to assisting with administrative tasks. However, these tools bring data privacy challenges that data protection teams must address. The hosts highlighted how such AI-driven platforms collect, store, and process data, which could lead to unintended data exposure. With AI tools rapidly becoming workplace staples, Jasmine, Joe, and Philip explored practical ways organisations can limit data risk, maintain transparency, and respect user privacy without stifling innovation.
  3. ISO/IEC 42001: AI Governance Standards
    The conversation then turned to ISO/IEC 42001, a proposed global standard for governing AI management systems, expected to offer a framework for responsible AI deployment. This standard aims to provide a set of best practices for ensuring AI is deployed with privacy, security, and ethical guidelines in mind. For organisations adopting AI technologies, ISO/IEC 42001 could serve as a valuable tool to implement data governance practices that are both ethical and compliant.

Reflecting on the Risks and Rewards of AI

In wrapping up, our hosts emphasised that AI in the workplace presents significant potential alongside its risks. They touched on a range of data protection considerations, from safeguarding personal data to maintaining accountability when using automated tools. Rather than instilling fear, Jasmine, Joe, and Philip encouraged the audience to look at AI as an opportunity for growth—provided it’s used with a strong commitment to data protection and ethical standards.

Listen to the Episode

Couldn’t make it to the live session? No problem! Listen to Episode 192 of Data Protection Made Easy on Spotify, Audible, or any major podcast platform. You’ll gain valuable insights and hear from fellow data protection enthusiasts as they share their views on AI’s impact on the workplace.

Subscribe and Join the Conversation

With AI evolving rapidly, we’re committed to keeping our audience informed. Tune into Data Protection Made Easy every Friday for the latest on data protection trends, regulatory updates, and expert discussions. Don’t miss the opportunity to be part of our thriving community of data protection professionals!

GDPR Radio – Episode 191

GDPR Radio – Episode 191: A Deep Dive into Data Protection News

In Episode 191 of GDPR Radio, the Data Protection Made Easy Podcast takes a critical look at the latest updates and developments in the world of data protection, with a special focus on the ethics and implications of tracking and profiling technologies. Hosted by industry experts Phil Brining, Catarina Santos, and Jasmine Harrison, this episode delivers in-depth conversations around pressing topics that influence the landscape of privacy and security.

Topics Discussed:

  1. Meta’s AI-Enhanced Ray-Ban Glasses: Our hosts examine the ethical challenges surrounding Meta’s latest product, which raises privacy concerns around data collection and surveillance. Are current frameworks like GDPR enough to regulate such innovations?
  2. Updates from the ICO and EDPB: Get the latest news from the Information Commissioner’s Office and the European Data Protection Board, with a spotlight on issues like fingerprint scanning of children in schools and the broader implications for digital consent.
  3. Ransomware Threats and the Casio Incident: Learn more about the rising threats of ransomware, particularly through high-profile cases like the Casio incident. The episode emphasises the crucial role Chief Information Security Officers (CISOs) play in addressing these challenges, stressing that the human element and workplace culture are key defences in cybersecurity.
  4. Deep Fake Technology: Our hosts dive into the social engineering risks posed by deep fake technology, urging organisations and individuals to balance innovation with privacy safeguards to avoid exploitation.

Meet Our Hosts:

  • Phil Brining: Phil brings sharp insights and practical knowledge to the discussion.
  • Catarina Santos: Catarina provides a legal perspective on current events.
  • Jasmine Harrison: Jasmine delves into the societal implications of emerging data technologies.

Join the Data Protection Community

When you join the Data Protection Made Easy community, you get access to a wealth of resources, including:

  • Regular updates on the latest data protection news and regulations.
  • In-depth discussions on GDPR, privacy laws, and emerging technologies.
  • Networking opportunities with professionals from various industries.

Why join?

  • Stay Informed: Be the first to know about the latest trends and regulatory updates in data protection.
  • Expand Your Network: Connect with like-minded professionals and industry experts.
  • Exclusive Content: Enjoy early access to podcasts, webinars, and expert insights.

How to Join:

Joining our community is simple! Just click here to sign up and become part of the conversation. Whether you’re a professional or just starting in the world of data protection, our community has something for everyone.

Listen to GDPR Radio – Episode 191: A Deep Dive into Data Protection News

Digital Footprints: The Ethics and Impact of Tracking and Profiling

Digital Footprints: The Ethics and Impact of Tracking and Profiling

In this insightful episode, our hosts Phil, Joe, and Jasmine dive deep into the complex world of Digital Footprints: The Ethics and Impact of Tracking and Profiling, focusing on its ethical implications and real-world impact.

Discussion Topics:

  • Workplace Monitoring: How companies track employee activities to boost productivity while maintaining a balance with privacy rights. Our hosts explore the fine line between justified surveillance and intrusion.
  • Online Tracking: An in-depth look at the methods used to track online behaviour, such as cookies and targeted advertising. The hosts debate whether it is possible to limit or avoid these types of tracking and what this means for the everyday user.

Meet Our Hosts:

  • Phil Brining: An expert in data privacy and protection, Phil brings a wealth of knowledge and a practical perspective to every discussion.
  • Joe Kirk: Joe offers insightful analysis on how tracking and profiling affect both businesses and consumers.
  • Jasmine Harrison: Jasmine helps break down the legal aspects of data protection and ethical monitoring, making complex topics accessible.

About the Podcast:

The Data Protection Made Easy Podcast is your go-to source for understanding the world of data protection, privacy laws, and ethical considerations. Each week, we delve into timely topics that matter to businesses, professionals, and individuals concerned about their digital footprint. Whether you’re a data privacy novice or a seasoned professional, our episodes are crafted to be both informative and engaging.

Join Our Community:

By becoming a part of the Data Protection Made Easy community, you gain exclusive access to insightful discussions, industry updates, and practical tips on safeguarding your data. You’ll also connect with like-minded professionals and stay ahead of key trends in data privacy.

Benefits of Joining:

  • Access to exclusive webinars and Q&A sessions with experts
  • Early access to new podcast episodes and special content
  • Networking opportunities with data protection professionals
  • Insightful resources on the latest developments in data protection and privacy

How to Join:

Joining is easy and free. Simply subscribe to the Data Protection Made Easy podcast on your favourite platform, such as Spotify, or participate in our live weekly sessions on Microsoft Teams. Stay informed, expand your network, and never miss out on critical updates in data protection.

Data Protection People Whitepapers

Data Privacy Learning & Guidance

How to Respond to a Data Subject Access Request (DSAR) 

Read about how to properly handle a Data Subject Access Request (DSAR) as a data controller whose organisation has received a request.

Do I need to do a DPIA?

Learn about Data Protection Impact Assessments (DPIAs) and how to manage them.

Data within Education

Having joined Data Protection People as a graduate fresh from Leeds Beckett University, my knowledge of GDPR and data protection was virtually non-existent; I was well and truly thrown in at the deep end. You could say it was like learning how to run before I could walk. Luckily, alongside having to…

Outsourced Consultant Versus In-House?

Do I need to do a DPIA? Whenever you implement a new processing activity, system, or process, you should consider whether a DPIA is needed. This should be done as early as possible to allow time to implement risk mitigation. Step one: is a DPIA legally required? The first thing…

Join our community

Our mission is to make data protection easy: easy to understand and easy to do. We do that through the mantra of benchmark, improve, maintain.

Join the Data Protection Made Easy podcast or Sign Up For Newsletters