Blog

  • GDPR Consent vs. Legitimate Interest: A Guide for Small Businesses & NGOs

    What is a Legal Basis under GDPR?

    A legal basis is the foundation that allows you to process personal data under the GDPR. Every organization—whether a small business, a freelancer, or an NGO—must be able to point to one specific legal basis for each purpose of processing. Without a valid legal basis, the processing is simply unlawful, no matter how harmless it may seem.

    The challenge is that not all legal bases are created equal. Some are stable and practical; others are fragile and easy to get wrong. Consent is the one most people know, but it is also the one most often misused. To understand why, it helps to look at how consent works in the real world, through the eyes of three very typical organizations:

    • a small yoga studio offering online classes,
    • a local animal‑rescue NGO coordinating volunteers,
    • and a neighborhood bakery running a simple loyalty program.

    Why Consent is Often Misused

    The yoga studio is the classic example of where consent works well. They send a weekly newsletter with class updates, wellness tips, and occasional promotions. This is a genuinely optional service. People can sign up if they want, ignore it if they don’t, and unsubscribe at any time without affecting their relationship with the studio. Consent here is meaningful: the user has a real choice, the purpose is clear, and withdrawal doesn’t break anything. A simple opt‑in checkbox is enough, and the studio can easily explain what the emails contain. This is what GDPR‑compliant consent looks like in practice—clean, transparent, and low‑risk.

    Things become more complicated when we look at the animal‑rescue NGO. They collect volunteer information—names, phone numbers, availability, emergency contacts—so they can coordinate shifts and ensure safety. Many NGOs assume they need consent for this, so they add a checkbox to their volunteer form. But this is where consent becomes a liability. If a volunteer withdraws consent, the NGO would have to stop using their information immediately, even if that means they can no longer contact them during an emergency or schedule them for an event. The NGO needs this information to function. Consent is the wrong legal basis here because the processing is necessary for the volunteer relationship. Withdrawal would break the workflow and undermine the organization’s ability to operate safely. The GDPR is clear: if the processing is necessary, consent is not appropriate.

    The bakery faces a similar issue. They run a loyalty program where customers earn a free pastry after ten purchases. They collect names and email addresses to track points and send occasional updates. Many small businesses default to consent here too, but the same problem appears: if a customer withdraws consent, the bakery would have to delete their data and stop tracking their points. The loyalty program itself depends on the data. Consent is too unstable for something that is part of the service. The bakery needs a legal basis that reflects the fact that the processing is necessary to deliver what the customer signed up for.

    Why Legitimate Interest is a Better Alternative

    These examples show why consent is so demanding. The GDPR sets a high bar: consent must be freely given, specific, informed, and unambiguous. For the yoga studio’s newsletter, this is manageable. But for the NGO and the bakery, meeting these requirements becomes unrealistic. “Freely given” is difficult when there is any imbalance of power, such as between an NGO and its volunteers. “Specific” means you cannot bundle multiple purposes together, which quickly becomes messy for organizations with several data uses. “Informed” requires clear explanations that many small organizations struggle to write. And “withdrawal at any time” is the biggest challenge: withdrawal must be as easy as giving consent, and it must stop the processing immediately. For the NGO and the bakery, that would make their services unworkable.

Consent also requires ongoing maintenance. You must record who consented, when, how, and for what purpose. You must track withdrawals. You must refresh consent when your purposes change. You must ensure that your consent mechanism remains compliant over time. For many small organizations, this becomes a significant administrative burden. And because consent is so easy to invalidate, it becomes a weak foundation for any processing that your organization depends on.

    This is why legitimate interest often works better for small organizations. It allows processing when it is necessary for a legitimate purpose and does not override the individual’s rights and freedoms. It is not a loophole or a shortcut; it requires a structured balancing exercise and clear documentation. But when used correctly, it offers a more stable and realistic foundation for everyday processing.

For the NGO, legitimate interest is a natural fit. Coordinating volunteers, ensuring safety, and managing events are all legitimate purposes. Volunteers reasonably expect their information to be used this way. Withdrawal would not make sense, but individuals still retain the right to object if something feels inappropriate. Legitimate interest respects both the organization’s needs and the individual’s rights.

    For the bakery, legitimate interest also works well. Running a loyalty program is a legitimate business purpose, and customers expect their data to be used to track points and send relevant updates. The bakery can explain this clearly without relying on a consent mechanism that could undermine the program. Customers still have rights, but the bakery is not forced into operational chaos if someone changes their mind.

    Even the yoga studio can benefit from legitimate interest for certain activities. While the newsletter should rely on consent, internal analytics, fraud prevention, or basic service improvements often fit better under legitimate interest. The key is choosing the legal basis that fits the purpose, not the one that feels safest.

    Consent is powerful when used correctly, but it is not the default. It is appropriate only when the processing is genuinely optional and withdrawal does not break the service. The yoga studio’s newsletter shows how well it can work. The NGO’s volunteer management and the bakery’s loyalty program show how quickly it becomes unstable when the processing is necessary.

    How to Transition to Legitimate Interest

    Legitimate interest often provides a more realistic and robust foundation for everyday processing. It aligns with user expectations, avoids the fragility of withdrawal, and supports the operational needs of small organizations—while still protecting individuals’ rights. To use legitimate interest properly, you must conduct a Legitimate Interest Assessment (LIA), a structured way to document your purpose, necessity, and balancing test.
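As a preview of what an LIA documents, here is a minimal sketch of its three-part structure in code. The field names and the simple all-three-tests-must-pass rule are illustrative only; the next article covers the real assessment in depth.

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestAssessment:
    """Illustrative LIA record: purpose, necessity, and balancing tests."""
    processing: str
    purpose_test: str        # why the interest is legitimate
    necessity_test: str      # why this data is needed to achieve it
    balancing_test: str      # why it does not override individuals' rights
    purpose_ok: bool
    necessity_ok: bool
    balancing_ok: bool

    def passes(self) -> bool:
        # Legitimate interest only holds if all three tests succeed.
        return self.purpose_ok and self.necessity_ok and self.balancing_ok

lia = LegitimateInterestAssessment(
    processing="Volunteer shift coordination",
    purpose_test="Coordinating volunteers and ensuring their safety",
    necessity_test="Contact details and availability are required to schedule shifts",
    balancing_test="Volunteers reasonably expect this use; the right to object is respected",
    purpose_ok=True, necessity_ok=True, balancing_ok=True,
)
print(lia.passes())  # True
```

The structure matters more than the code: if any one of the three tests fails, legitimate interest is not available and another legal basis must be found.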

    The next article will walk through how to create an LIA in a clear, practical way that fits the daily reality of a small business or NGO. If you need personalized help, just get in touch.

  • From Prompt to Policy: Secure AI Integration for Small Organizations

    The “Wild West” of Artificial Intelligence has officially packed up its wagons and left town. If you look back to the early 2020s, it felt like a digital gold rush—a chaotic, thrilling era where the only limit to what you could do with a prompt was your own imagination. But as we navigate the landscape of 2026, the atmosphere has fundamentally changed.

    For small business owners and NGO directors, the conversation has shifted from a wide-eyed “Look what this can do!” to a much more grounded, and perhaps more urgent, “How do we use this without getting a formal knock on the door from a Data Protection Authority?”

    We’ve reached a tipping point where innovation and regulation are finally shaking hands, but that handshake can feel a bit like a squeeze if you aren’t prepared. With the EU AI Act now reaching full enforcement as of August 2026, and the GDPR having matured into a sophisticated enforcement machine, ‘good enough’ compliance is now a major business liability. For a consulting firm working in the privacy space, we see the same pattern every day: organizations that want to be at the cutting edge but are terrified of falling off the cliff. This isn’t just about avoiding fines anymore; it’s about maintaining the trust of your clients and donors in a world where data is the most volatile asset you own.

    To understand where we are, we first have to demystify the regulatory giant in the room: the EU AI Act. For years, people talked about it as a distant storm on the horizon, but today, it is the weather we live in. The mistake many SMBs make is thinking the AI Act replaces the GDPR. In reality, they are two sides of the same coin. Think of the GDPR as the protector of the person—it cares about whose data you have and how you got it. The AI Act, conversely, is the protector of the process—it cares about what the machine is doing with that data and whether it’s behaving ethically.

Most of the tools you use every day—the chatbots on your website, the AI-driven copy generators, the spam filters in your inbox—fall into what the Act calls “Minimal Risk.” But “minimal” does not mean “exempt.” By mid-2026, the transparency requirements are absolute. If a human is interacting with an AI, they must be told. “Stealth AI” is no longer a clever way to appear more “human” to your customers; it is a legal liability. The moment you cross the line into “High Risk” territory—using AI to screen CVs for a job opening, monitoring your employees’ performance, or using algorithms to decide who gets a discount or a service—you aren’t just a user of technology; you are a “deployer” of high-risk systems. This requires a level of documentation and human oversight that most small organizations simply aren’t equipped for yet.

    However, the most immediate danger to your organization likely isn’t the regulation itself, but what we call “Shadow AI.” This is the phenomenon of the well-meaning, highly efficient employee who wants to save five hours of work a week. They take a sensitive client contract, a list of vulnerable donors, or a transcript of a confidential meeting and paste it into a free, public version of a popular AI tool to get a summary or a draft. In that single click, your data has entered a “black hole.”

    In public-facing AI models, the information you provide is often absorbed into the collective “brain” of the machine to train future versions. We’ve already seen instances in 2026 where proprietary business strategies or private donor details were inadvertently surfaced as “suggestions” to other users on the other side of the world. For an SMB, that’s a lost competitive advantage. For an NGO, it’s a catastrophic breach of trust with the very people they exist to protect.

    The solution isn’t to ban AI—that’s like banning the internet in the 90s. The solution is to move toward private, enterprise-grade environments. Most modern CRMs and project management tools now offer “Private Instances” where your data stays within your specific “tenant.” It doesn’t leave your walls to train the global model. If you haven’t audited your software settings recently, you’re likely still operating on default settings that favor the AI provider’s data-hungry training needs rather than your privacy. One of the most effective things you can do today is to hunt down every “Data Training” toggle in your tech stack and flip it to OFF.

    Beyond the technical settings, you need a cultural shift in how your team handles data. We recommend a “Coffee Shop” rule: If you wouldn’t read a document out loud in a busy coffee shop, you shouldn’t paste it into a public AI tool. This is where anonymization becomes your best friend. In the GDPR world, if you strip away the names, addresses, and identifiers before the data hits the AI, your risk profile drops to almost zero. It takes an extra minute to change “John Doe from London” to “Client A,” but that minute could save you from a multi-thousand-euro fine.
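The “change John Doe to Client A” step can even be semi-automated. Below is a minimal pseudonymization sketch: direct identifiers are swapped for neutral labels before the text goes anywhere near a public AI tool, and the mapping stays local so you can re-identify the answer afterwards. The identifier list here is hand-supplied and illustrative; real documents need a proper review, since a script cannot know every identifying detail.

```python
import re

def pseudonymise(text: str, identifiers: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each known identifier with a neutral label (Client A, B, ...)."""
    mapping: dict[str, str] = {}
    for i, name in enumerate(identifiers, start=1):
        label = f"Client {chr(64 + i)}"   # 65 = 'A', so Client A, Client B, ...
        mapping[label] = name             # kept locally, never sent to the AI tool
        text = re.sub(re.escape(name), label, text)
    return text, mapping

safe_text, key = pseudonymise(
    "John Doe from London asked about his contract.",
    identifiers=["John Doe", "London"],
)
print(safe_text)  # Client A from Client B asked about his contract.
```

Only `safe_text` leaves your machine; the `key` dictionary stays behind your own walls. That one extra minute is the “Coffee Shop” rule in practice.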

    Another reality of 2026 is the death of the “black box.” Under Article 22 of the GDPR, and reinforced by the AI Act, individuals have a right to an explanation. If your AI helps you decide who gets a loan, which job candidate gets an interview, or which beneficiary receives aid, you must be able to explain the “why.” If your system is so complex that you can’t trace the logic of its decisions, you are effectively flying a plane without an instrument panel. You must keep a “human in the loop.” AI should be a co-pilot that offers suggestions, but the final, legal, and ethical responsibility for every decision must land on the desk of a human being.

    For NGOs, this is even more critical. Non-profits often handle what the law calls “Special Categories” of data—health records, political affiliations, or religious beliefs. In the eyes of a regulator, mishandling this data is a much more serious offense than a retail store losing an email list. For an NGO, privacy isn’t just a compliance checkbox; it’s an extension of the “Do No Harm” principle. When you use AI to translate documents for a refugee or to predict where aid will be needed next, you are handling the digital lives of people who are already at risk. Your AI strategy must reflect that.

    So, where does this leave you? It’s easy to feel like the walls are closing in, but there is a massive silver lining here. In 2026, privacy is no longer a boring back-office concern; it is a brand differentiator. We are seeing a “flight to quality” among consumers and donors. People are tired of feeling like their data is being harvested by invisible machines. When an SMB can say to a client, “We use AI to give you faster service, but we use a secure, private instance that guarantees your data never leaves our sight,” that business wins. When an NGO can demonstrate to a donor that their gift and their personal data are protected by ironclad “Privacy-by-Design” principles, that NGO builds a level of loyalty that no marketing campaign can buy.

    The era of experimentation was fun, but the era of accountability is where the real value is built. You don’t need a million-euro legal budget to get this right; you just need a clear policy, the right tool settings, and a commitment to transparency. Compliance isn’t the hurdle that stops you from running the race; it’s the track that makes the race possible in the first place.

    The landscape is moving fast, and local laws—like the UK’s expected 2026 AI legislation—are continuing to align with these high standards. This isn’t a trend that will fade; it’s the new baseline for doing business in the 21st century.

  • From Uncertainty to Confidence: Why a Small Business Needed a GDPR PIA

    A small Scandinavian retail business had been expanding steadily, evolving from a charming physical shop into a successful online store with a growing customer base. The owner took pride in offering high‑quality products and personal service, but as the digital side of the business grew, so did a quiet sense of uncertainty. The company collected customer information through online orders, loyalty programs, newsletters, and several third‑party tools, yet there was no clear overview of how all this data moved through the business. Nothing seemed wrong on the surface, but the owner couldn’t shake the feeling that something important might be overlooked. That feeling intensified the day a customer emailed asking how long their purchase history was stored and whether they could receive a copy of all the data the company held about them. The owner wanted to respond transparently but didn’t know where to start. It was the kind of moment that often pushes small businesses to seek structured guidance.

    What many small businesses don’t realize is that GDPR actually requires a Data Protection Impact Assessment (DPIA/PIA) in certain situations. A PIA becomes mandatory when a business processes personal data in ways that are likely to pose a high risk to individuals’ rights and freedoms. This includes activities such as systematic monitoring, large‑scale processing of customer data, using new technologies that track behavior, profiling for marketing purposes, or relying on multiple third‑party tools that access customer information. Even if a business doesn’t think of itself as “large‑scale,” the combination of online sales, loyalty programs, analytics tools, and customer tracking can easily meet the threshold. In this case, the business had grown organically, adding systems and tools over time without ever evaluating how they interacted. That alone is enough to trigger the need for a PIA — not because something is wrong, but because the risk level has increased without anyone noticing.

    If I had been brought in as a consultant, my first step would have been to reframe the situation. Many small business owners fear that a GDPR review will expose major problems or lead to overwhelming obligations. I would have explained that a PIA is not about pointing out failures but about creating clarity, structure, and confidence. It’s a tool that helps businesses understand their data practices, identify risks they may not be aware of, and build trust with customers. That shift in perspective alone often reduces anxiety. Instead of feeling judged, owners begin to feel supported and empowered.

    The PIA process would have started with a detailed mapping of all personal data entering the business. This would include online order details, newsletter sign‑ups, loyalty program registrations, supplier information, customer service inquiries, and even data collected through the shop’s guest Wi‑Fi. Most small businesses are surprised when they see the full picture. What feels like a simple operation often reveals a complex network of data flows created over time as the business grows and adopts new tools.
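A data map need not be sophisticated to be revealing. The sketch below shows the idea as a simple structure: one entry per way personal data enters the business, noting what is collected and which system holds it. The sources and fields are examples drawn from the case above, not a template.

```python
# Illustrative data map for a small retail business: each entry records
# one intake channel, the personal data it collects, and where it is stored.
data_map = {
    "online orders":       {"data": ["name", "address", "email", "purchase history"],
                            "stored_in": "webshop platform"},
    "newsletter sign-ups": {"data": ["email"], "stored_in": "email marketing tool"},
    "loyalty program":     {"data": ["name", "email", "points balance"],
                            "stored_in": "CRM"},
    "guest Wi-Fi":         {"data": ["device identifier"], "stored_in": "router logs"},
}

# Even this small map shows how many separate systems hold personal data:
systems = {entry["stored_in"] for entry in data_map.values()}
print(sorted(systems))
```

Seeing four intake channels spread across four systems is exactly the “full picture” moment most owners experience: the operation felt simple, but the data flows were not.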

    Once the data map was complete, I would have assessed potential risks. These typically aren’t dramatic but are meaningful. In a case like this, it’s common to find that customer data is stored indefinitely because no retention policy has ever been defined. Marketing consents may be bundled together, making it unclear whether customers explicitly agreed to receive promotional emails. Third‑party plugins may have vague or outdated data‑processing terms. Staff may be unsure how to handle data‑access requests, and there may be no documented process for responding to them. These issues are not signs of negligence; they are signs of a business that has grown faster than its internal structure. A PIA is designed to reveal these blind spots before they become real problems.

    The next step would be turning findings into practical, manageable recommendations. I would propose implementing a clear retention period for customer purchase history, updating the privacy notice with transparent, plain‑language explanations, separating marketing consent from account creation, replacing any questionable plugins with GDPR‑compliant alternatives, and introducing a simple internal process for handling data‑access requests. I would also recommend a short training session for staff to ensure everyone understands the basics of personal data handling. The goal is always to make the solutions realistic and aligned with how the business already operates, not to impose unnecessary complexity.
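The retention recommendation, for instance, is straightforward to operationalize. Here is a minimal sketch of enforcing a retention period over purchase records; the three-year cut-off is purely an example, since the right period depends on the business and on local bookkeeping law.

```python
from datetime import date, timedelta

# Example retention period: roughly three years (an illustrative choice,
# not legal advice - bookkeeping law may require a specific period).
RETENTION = timedelta(days=3 * 365)

def apply_retention(records: list[dict], today: date) -> list[dict]:
    """Keep only purchase records newer than the retention cut-off."""
    cutoff = today - RETENTION
    return [r for r in records if r["purchase_date"] >= cutoff]

records = [
    {"customer": "a-17", "purchase_date": date(2020, 5, 1)},
    {"customer": "b-03", "purchase_date": date(2025, 2, 14)},
]
kept = apply_retention(records, today=date(2026, 1, 1))
print(len(kept))  # 1 - only the recent record survives
```

In practice this would run as a scheduled job against the real customer database, but the principle is the same: once a retention period is defined, enforcing it can be routine rather than a judgment call.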

    If the business had implemented these recommendations, the transformation would have been noticeable. Staff would feel more confident because they finally understood what personal data means in practice and how to handle it responsibly. The updated privacy notice on the website would be clear, friendly, and easy for customers to understand. A simple form for data requests would make the process transparent and efficient. The marketing list would become more engaged after consent was clarified, and the business would gain better oversight of its third‑party tools. Even internal systems would run more smoothly once unnecessary old data was removed according to the new retention policy.

    But the most significant change would be the owner’s mindset. What once felt like a looming threat would begin to feel like a strength. Instead of fearing GDPR, the owner would see it as part of running a trustworthy, modern business. A PIA doesn’t just reduce legal risk; it improves operational efficiency and strengthens customer trust. It creates a foundation the business can build on as it continues to grow. In similar cases, owners often receive positive feedback from customers who appreciate the transparency and clarity of the updated privacy information. That kind of response confirms the value of the work. It shows that GDPR compliance isn’t just a regulatory requirement — it’s a way to demonstrate professionalism and respect for customers.

    If you’re curious about how a PIA could strengthen your own business, you’re welcome to reach out anytime. I’m always happy to discuss your situation, answer questions, and help you get a clearer understanding of your data practices.