GARIN AI Spółka Akcyjna

GDPR

Platforms that use artificial intelligence to generate video avatars, AI video models, and similar outputs operate on user data (audio, images, video), which makes protecting users’ personal data essential and, for residents of the European Union, legally required.

1. GDPR requirements when providing services from Poland/the EU

These requirements apply when the service operates in the EU/Poland or serves users from the EU.

1.1. Controller/processor status

  • If the platform collects personal data (e.g. biometric image, recordings, avatar, or voice) from users, it becomes a data controller or joint controller – in such cases, a lawful basis for processing (consent, contract, legal obligation, etc.) is required.
  • If certain processing functions are outsourced to third parties (e.g. cloud providers or external AI services), it is necessary to sign a Data Processing Agreement (DPA) in compliance with Article 28 of the GDPR.

1.2. User consent and rights of individuals

  • A transparent privacy policy must be provided – clearly indicating what data is collected, for what purpose, how long it will be stored, and with whom it will be shared, etc.
  • Ensure users’ rights of access, rectification, erasure (“right to be forgotten”), restriction of processing, and data portability, among others.
  • If sensitive data (e.g. biometric data) is processed, it requires additional safeguards, limitations, or explicit consent.

1.3. Transfer of data outside the EU

  • If servers or AI processing remain in the United States, transferring personal data from the EU to the U.S. requires a compliance mechanism: Standard Contractual Clauses (SCCs), an adequacy mechanism approved by the European Commission (e.g., the EU–US Data Privacy Framework), or other measures ensuring an adequate level of data protection.
  • AI components that generate or process data may be considered a form of data processing – in such cases, data export requirements apply to both input and output data.

1.4. Data Protection Impact Assessment (DPIA)

  • If the processing involves a high risk (e.g. biometric analysis, profiling, or image manipulation), a Data Protection Impact Assessment (DPIA) must be carried out before the service is launched.
  • Implement technical controls: pseudonymization/anonymization (where feasible), access restrictions, encryption, and data minimization.
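As an illustration, the pseudonymization and data-minimization controls above can be sketched in Python. The field names, allow-list, and key-handling scheme below are illustrative assumptions, not the platform’s actual schema; in production the HMAC key would live in a KMS or vault, not in source code.

```python
import hmac
import hashlib

# Hypothetical key and schema for illustration only.
PSEUDONYM_KEY = b"rotate-me-and-store-in-a-vault"
ALLOWED_FIELDS = {"avatar_style", "language", "video_length_s"}  # data-minimization allow-list

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (GDPR pseudonymization).

    Unlike plain hashing, HMAC with a separately stored key prevents
    re-identification by dictionary attack without access to the key.
    """
    return hmac.new(PSEUDONYM_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def minimize(record: dict) -> dict:
    """Keep only fields needed for the stated processing purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

record = {
    "user_id": "anna.kowalska@example.com",
    "avatar_style": "studio",
    "language": "pl",
    "video_length_s": 45,
    "ip_address": "203.0.113.7",  # not needed for rendering -> dropped
}
safe = minimize(record)
safe["subject_ref"] = pseudonymize(record["user_id"])
```

Because the HMAC is deterministic, the same user maps to the same `subject_ref`, so analytics and deletion requests can still be correlated without storing the raw identifier.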

2. AI Act (EU) – new EU regulations for AI

The AI Act (Regulation (EU) 2024/1689) introduces new requirements for AI systems in the EU.

2.1. Scope of activity

  • The AI Act has a horizontal scope – it covers a wide range of applications, following a risk-based approach.
  • Depending on the function of the AI system, it may be classified as:
    • Unacceptable risk – prohibited practices (e.g. manipulation, social scoring, etc.).
    • High risk – AI systems that may affect fundamental rights (e.g. facial recognition, candidate screening, healthcare, or safety applications) — subject to strict documentation requirements, risk assessment, human oversight, and incident reporting obligations.
    • Limited risk – subject to transparency obligations (e.g. informing users that they are interacting with an AI system).
    • Minimal risk – minimal regulatory obligations apply.

2.2. Obligations for AI providers/users

  • Technical documentation and risk assessments
  • Human oversight mechanisms
  • Training data management: quality and absence of bias
  • Post-deployment performance monitoring
  • Transparency obligations – informing users of AI interaction (for limited-risk systems)
  • Incident reporting obligations (AI malfunctions or harm caused by AI)
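The incident-reporting obligation above can be supported by structured logging. A minimal sketch follows; the AI Act does not prescribe a log format, so the field names and the rule that "serious" severity triggers a notification duty are assumptions made for illustration.

```python
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("ai_incidents")
logging.basicConfig(level=logging.INFO)

def report_incident(system_id: str, severity: str, description: str) -> dict:
    """Record an AI malfunction/harm event for later reporting to the regulator."""
    event = {
        "system_id": system_id,
        "severity": severity,  # illustrative rule: "serious" triggers notification duties
        "description": description,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "requires_notification": severity == "serious",
    }
    logger.info("AI incident: %s", json.dumps(event))
    return event

event = report_incident(
    "avatar-gen-v2", "serious", "Generated avatar of a non-consenting person"
)
```

Keeping incidents as structured records (rather than free-text logs) makes it easier to assemble the documentation a supervisory authority may request.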

2.3. Enforcement and schedule

  • The AI Act entered into force on 1 August 2024 and applies in stages: prohibitions from February 2025, obligations for general-purpose AI models from August 2025, most remaining provisions from August 2026, and requirements for certain high-risk systems embedded in regulated products from August 2027.
  • Penalties under the AI Act scale with company turnover: up to €35 million or 7% of global annual turnover for prohibited practices, with lower tiers (up to 3% or 1%) for other violations, whichever amount is higher.
  • For comparison, GDPR fines can reach €20 million or 4% of global annual turnover, whichever is higher.

2.4. Extraterritorial effects

Even entities outside the EU (e.g. based in the U.S.) must comply with the AI Act if they provide AI to EU users or if their AI affects users in the EU.

3. Specific risks/challenges for transferring the portal to Poland/the EU

  • Implementation of EU-based infrastructure – it is necessary to have servers/cloud operating within the EU to avoid large-scale data transfers overseas, which complicate GDPR compliance.
  • Establishing a data controller address in the EU – an EU representative or contact address may be required for data protection matters.
  • Updating consents and terms – users must give consent in accordance with GDPR, in the local language, with full information about their rights.
  • AI audit / risk assessment – assess which AI components may be considered high-risk; if video generation involves facial recognition or emotion analysis, it may qualify as high-risk.
  • Transparency / labeling AI-generated content – users should be informed when a product/output is AI-generated, where required (limited risk).
  • Reporting to authorities / cooperation – be prepared to provide documentation to the supervisory authority (Polish Personal Data Protection Office) and maintain auditability.
  • Compliance costs – documentation, audits, monitoring, legal and technical resources.

4. Recommendations for migration

  1. Mapping AI data and processes – identify which modules use personal data, which AI components are critical, and what risks they generate.
  2. Infrastructure location – deploy cloud/server infrastructure within the EU (Poland, Germany, France) to minimize transfers outside the EU.
  3. Signing SCCs (Standard Contractual Clauses) with each non-EU provider/partner if data will be processed outside the EU.
  4. Conducting DPIA / FRIA – assess impact on fundamental rights, document findings, and implement mitigation measures.
  5. Terms of Service and privacy policy aligned with GDPR and the AI Act – clear communication, consents, and user rights.
  6. AI monitoring procedures – logging, performance audits, procedures for handling AI errors/incidents.
  7. Compliance/legal team – to monitor AI law updates and ensure timely responses.

5. Comparison of differences between the US and Poland/EU

The comparison below takes the perspective of compliance law and practice.

5.1. Servers in the US (outside the EU)

Advantages:
  • No need for data and infrastructure migration (lower technical costs).
  • Maintaining existing contracts with cloud providers (AWS, Azure, Google, etc.).

Disadvantages and limitations:
  • Transfer of personal data outside the EEA requires a legal mechanism:
    • SCCs (Standard Contractual Clauses), or the EU–US Data Privacy Framework (not all companies are certified, and the adequacy decision may be challenged before the CJEU).
  • Data sent to the U.S. carries the risk of being considered an illegal transfer.
  • Requirement to conduct a transfer risk assessment (TIA – Transfer Impact Assessment) and implement additional safeguards (encryption, pseudonymization).
  • Higher risk of supervision and potential fines by data protection authorities (e.g. Meta/Facebook fined for data transfers to the U.S.).
  • AI Act – server location is not decisive; a provider established outside the EU must still appoint an EU authorised representative and keep documentation and oversight of the AI system available to EU authorities.
  • Reputation: EU clients often expect that data will not leave Europe.

5.2. Servers in Poland/EU (data remains within the EEA)

Advantages:
  • Personal data does not leave the EEA, which removes the transfer-mechanism burden (no SCCs, TIA, or Privacy Framework needed) and significantly simplifies GDPR compliance.
  • Lower risk of administrative fines and legal disputes.
  • Better PR / customer trust – message: “Your data does not leave the EU.”
  • Easier cooperation with regulators (e.g., UODO).
  • Ability to appoint a Data Protection Officer (DPO) and establish clear procedures.
  • Local infrastructure also facilitates AI Act compliance – easier oversight of AI systems, risk assessments, and reporting.

Disadvantages and limitations:
  • Costs of migrating servers, data, and services to the EU.
  • Implementation time and potential integration delays (especially if using U.S.-based APIs/AI – data may still flow outside the EU).
  • Need to establish new contracts with cloud providers (e.g., AWS Frankfurt, Google Cloud Warsaw, OVH, etc.).

5.3. Summary of differences (table)

| Criterion | Servers in the USA | Servers in Poland/EU |
|---|---|---|
| GDPR | SCCs/TIA/Privacy Framework required, risk of illegality | Data stays in the EU – no transfer, full compliance |
| AI Act | Still applies, but harder to demonstrate supervision and control | Easier fulfillment of obligations (audits, reports) |
| Penalties/legal risks | Higher (CJEU precedents, e.g., Schrems II) | Lower (no transfer to a third country) |
| Customer trust | Lower (“US data”) | Higher (“EU data”) |
| Costs | No migration, lower initial costs | Cost of migration and new infrastructure |
| Image/compliance | Potentially negative | Positive – EU-compliant |

To sum up:

Servers in the US = greater legal risk, need for additional clauses and transfer assessments.
Servers in Poland/EU = greater compliance and legal security, but costly migration.

6. Steps to transfer the portal to the EU/Poland

6.1. Scenario A – servers remain in the US

  1. Personal data transfer (GDPR)
    • Signing SCCs (Standard Contractual Clauses) with the server provider (e.g., AWS, Google).
    • Verify whether the provider is certified under the EU–US Data Privacy Framework.
    • Conduct a Transfer Impact Assessment (TIA) → assess the risk of transferring data to the U.S.
    • Implement additional technical measures: end-to-end encryption, pseudonymization, limiting the scope of data sent to the U.S.
  2. GDPR – documentation and procedures
    • Update the Privacy Policy (indicating that data may be transferred to the U.S.).
    • Maintain a Record of Processing Activities (Art. 30 GDPR).
    • Procedures for exercising data subject rights (access, erasure, portability).
    • Appoint a Data Protection Officer (DPO) in Poland/EU.
  3. AI Act – preparation
    • System classification (whether “limited risk” → obligation to inform users, or “high risk” → documentation, human oversight, compliance testing).
    • Prepare technical documentation (training data, quality control mechanisms).
    • Develop incident reporting procedures.
  4. Contracts with clients
    • Add a clause regarding data transfers to the U.S. in the Terms of Service.
    • Sign DPAs (Data Processing Agreements) with subcontractors.
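The "limiting the scope of data sent to the U.S." measure from step 1 can be enforced in code with a pre-transfer filter. The field categories below are illustrative assumptions (the platform’s real schema may differ): an allow-list of pseudonymous/technical fields may be exported, while special-category data in the sense of Art. 9 GDPR never leaves the EEA.

```python
# Hypothetical field categories for illustration.
SPECIAL_CATEGORIES = {"face_embedding", "voice_print", "health_note"}  # Art. 9-style data
EXPORT_ALLOW_LIST = {"job_id", "render_params", "subject_ref"}         # pseudonymous/technical

def prepare_for_us_transfer(payload: dict) -> dict:
    """Drop everything outside the allow-list before sending to a U.S. processor."""
    exported = {k: v for k, v in payload.items() if k in EXPORT_ALLOW_LIST}
    # Belt-and-braces check: special-category data must never leave the EEA.
    assert not (SPECIAL_CATEGORIES & exported.keys()), "special-category data must not be exported"
    return exported

payload = {
    "job_id": "J-1042",
    "render_params": {"style": "studio"},
    "subject_ref": "9f2c1a...",      # pseudonymized identifier created upstream
    "face_embedding": [0.12, 0.93],  # biometric data -> stays in the EU
}
exported = prepare_for_us_transfer(payload)
```

An allow-list (rather than a block-list) is the safer default here: any new field added to the payload is excluded from the transfer until someone consciously approves it.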

6.2. Scenario B – servers in Poland/EU

  1. Data and infrastructure migration
    • Selecting a server/cloud provider within the EU.
    • Data migration plan (minimizing downtime, security testing).
    • Signing a data processing agreement with the EU provider.
  2. GDPR – simplified requirements
    • No need for SCCs or TIA → data does not leave the EU.
    • Privacy Policy indicating: “data processed exclusively within the EU.”
    • Appoint a DPO where Art. 37 GDPR requires it (core activities involving large-scale processing of special categories of data, such as biometrics, or large-scale, regular and systematic monitoring of individuals); note that the 250-employee threshold concerns the record of processing activities, not the DPO.
  3. AI Act – preparation
    • As in Scenario A: system classification, documentation, compliance testing.
    • Easier cooperation with regulators (UODO, European Commission).
  4. Contracts with clients
    • Terms of Service + Privacy Policy → indicate that data remains within the EU.
    • Clear rules for using AI services, obligation to inform users about AI-generated content.

6.3. Common to both scenarios

  • Privacy by design & by default → minimize collected data, anonymize where possible.
  • DPIA (Data Protection Impact Assessment) → mandatory if the system processes biometric data, voice, or images.
  • AI audit → process for verifying compliance with the AI Act, documenting risks.
  • Incident policy → procedures in case of data breaches or AI errors.
  • Marketing/PR communication → any information about AI must comply with transparency obligations (AI Act).

6.4. Practical recommendation

  • Minimizing legal risks → servers in the EU (Poland/Germany) = fewer formalities, higher customer trust, easier GDPR compliance.
  • Time and cost priority → keeping servers in the U.S. is possible, but you must rely on SCCs and the EU–US Data Privacy Framework, which still carries risk (possibility of this mechanism being challenged by the CJEU in the future).