Discover how you can build a fully functional data warehouse in just 45 minutes, with no deep technical expertise needed! Using Gen-AI, Victor de Graaff will demonstrate how to set up, populate, and visualize data in a BI dashboard with the help of Azure, ChatGPT, and public APIs, making advanced data engineering accessible to all.
In an era where data and artificial intelligence play a central role in business strategies, it is crucial to understand not only the new opportunities but also the latest European legislation in this field. This session offers an engaging and accessible overview of the three most influential European laws of the moment: the AI Act, the Data Governance Act, and the Data Act.
In this keynote, we will discuss how data governance can serve as a keystone for building ethical AI and digital trust. We will explore the challenges and opportunities of data governance in the context of AI, and present some best practices and frameworks for implementing data governance in AI projects. We will also share examples, case studies, recommendations, and future directions.
With the rapid developments in data democratization and AI, integrating privacy by design into the architecture is becoming essential. It should no longer be seen as an obstacle, but rather as a catalyst for this progress. Damhof's quadrant model offers guidance here.
This seminar explores the strategic implementation of Knowledge Graph initiatives within organizations, offering a comprehensive framework that blends cutting-edge techniques with real-world case studies. It equips participants with the crucial understanding needed to make informed decisions, optimize initiatives, and unlock the transformative potential of Knowledge Graphs in today’s data-driven landscape.
This half-day workshop looks at the development of data products in detail, as well as the strengths and weaknesses of data mesh implementation options for data product development. Which architecture is best suited to implement this? How do you coordinate multiple domain-oriented teams and use common data infrastructure software such as a Data Fabric to create high-quality, compliant, reusable data products in a Data Mesh? Is there a methodology for creating data products? And how can you use a data marketplace to share data products and govern their sharing?