From Chaos to Control: AI and ML in Modern Data Governance

"Without data governance, AI is a huge liability. While AI and ML are transforming data governance by automating tasks, improving data quality and enabling predictive insights, they also require careful attention to ethics, compliance and human oversight," says our HSLU lecturer, Dr Dimitrios Marinos. Find out more about the benefits of integrating AI and ML into data governance by reading the full article.

Shortcuts:
Intro | The significance of metadata | Preventing biases in data | Thinking about data privacy | Conclusion | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


What role do AI and ML play in the data governance landscape?

Artificial intelligence (AI) and machine learning (ML) are rapidly reshaping the data governance landscape, bringing with them both significant opportunities and challenges. As organisations increasingly recognise the value of data as a critical asset, the integration of AI and ML into data governance frameworks is becoming essential. These technologies not only improve the efficiency and accuracy of data management processes, but also enable organisations to gain deeper insights and make more informed decisions. However, the adoption of AI and ML in data governance requires careful consideration of ethical implications, regulatory compliance and the need for human oversight.

AI and ML are particularly transformative in automating data governance tasks such as data cleansing, preparation and quality management, tasks that have traditionally been labour-intensive and prone to human error. With AI and ML, organisations can automate the identification and correction of data inconsistencies, duplicates and errors across large data sets. This automation not only reduces the time and resources required for data management, but also ensures that data is accurate, reliable and ready for analysis. By using AI-driven algorithms, organisations can maintain high data quality standards with minimal manual intervention.
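
As a minimal illustration of what such rule-based cleansing can look like (a sketch in Python with pandas, not a description of any particular governance tool), the snippet below normalises formatting, removes duplicate records and flags entries that fail a simple validity check for review:

```python
import pandas as pd

# Hypothetical customer records with typical quality problems:
# duplicates, inconsistent formatting and missing values.
records = pd.DataFrame({
    "customer_id": [101, 101, 102, 103, 104],
    "email": ["a@example.com", "a@example.com", "B@EXAMPLE.COM", None, "d@example"],
    "country": ["CH", "CH", "ch ", "DE", "FR"],
})

# Normalise formatting so comparisons behave consistently.
records["email"] = records["email"].str.strip().str.lower()
records["country"] = records["country"].str.strip().str.upper()

# Remove exact duplicates introduced by repeated loads.
records = records.drop_duplicates()

# Flag (rather than silently drop) rows that violate a simple validity rule,
# so that a data steward can review the exceptions.
records["needs_review"] = ~records["email"].fillna("").str.match(r"[^@]+@[^@]+\.[^@]+")

print(records)
```

In practice such rules would be maintained centrally or learned from the data, but even this simple pattern shows how much routine checking can be taken out of human hands.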

The significance of metadata

Metadata, often referred to as ‘data about data’, plays a critical role in understanding the context, structure and governance of data assets. Beyond data cleansing, AI and ML can significantly improve the process of metadata management and data lineage. AI-powered tools can automate the collection, cataloguing and updating of metadata, ensuring that it remains consistent and up-to-date across the organisation. This automation facilitates better data discovery and use, as stakeholders can quickly find and understand the data they need. In addition, AI-driven data lineage tools provide a clear visualisation of how data flows and transforms within the organisation, enabling better impact analysis and traceability. This is particularly important in industries with strict regulatory requirements, where understanding the provenance of data is essential for compliance.
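
To make this more concrete, here is a deliberately simplified sketch of automated metadata collection; the field names and structure are illustrative assumptions rather than the schema of any catalogue product, but profiling each dataset and recording its upstream sources is exactly the raw material that catalogue and lineage tools build on:

```python
import pandas as pd
from datetime import datetime, timezone

def collect_metadata(name, df, sources):
    """Profile a dataset and record its upstream sources (a simple form of lineage)."""
    return {
        "dataset": name,
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "row_count": len(df),
        "columns": {
            col: {"dtype": str(df[col].dtype),
                  "null_fraction": round(float(df[col].isna().mean()), 3)}
            for col in df.columns
        },
        "upstream_sources": sources,  # which datasets fed this one
    }

# Hypothetical cleaned orders table derived from a raw CRM extract.
orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [19.90, None, 5.00]})
catalog_entry = collect_metadata("orders_clean", orders, sources=["crm.orders_raw"])
print(catalog_entry)
```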

Another area where AI and ML are having a significant impact is in predictive analytics within data governance. By analysing historical data patterns, ML models can predict future trends, detect potential problems and identify opportunities for optimisation. For example, AI can be used to predict data quality issues before they occur, allowing organisations to take proactive steps to address them. This predictive capability is invaluable in maintaining the integrity of data assets and ensuring that decision-makers have access to accurate and timely information. In addition, AI-driven analytics can deliver personalised insights to users based on their specific needs and behaviours, improving the overall user experience and enabling more targeted decision making.
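
Purely as an illustrative sketch on synthetic data (not a reference implementation), the example below trains a simple classifier that estimates whether the next data load is likely to fail quality checks, based on characteristics of past loads:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic history of daily data loads: volume, share of late-arriving records
# and share of null values. The label marks loads that later failed quality checks.
n = 500
X = np.column_stack([
    rng.normal(100_000, 15_000, n),  # rows loaded
    rng.uniform(0, 0.20, n),         # fraction of late-arriving records
    rng.uniform(0, 0.10, n),         # fraction of null values
])
y = ((X[:, 1] > 0.12) | (X[:, 2] > 0.07)).astype(int)  # simplistic "failure" rule

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Score today's load before publishing it; a high probability triggers a review.
today = np.array([[110_000, 0.15, 0.03]])
print("probability of quality issues:", model.predict_proba(today)[0, 1])
```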

How to prevent biases in data

The integration of AI and ML into data governance also presents challenges, particularly in terms of ethical considerations and regulatory compliance. As AI systems become more involved in decision-making processes, it is crucial to ensure that they operate transparently and without bias. AI algorithms can inadvertently perpetuate existing biases in data or make decisions that are difficult for humans to interpret. To address this, organisations need to implement rigorous testing, validation and monitoring of AI models to ensure fairness and accountability. In addition, there is a growing need for clear guidelines and regulations governing the use of AI in data governance to protect individual rights and prevent misuse.
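
A very small example of such monitoring, sketched with hypothetical decision data, is to compare outcome rates across groups and raise a warning when the gap exceeds a chosen threshold; real fairness audits are considerably more nuanced, but the pattern is the same:

```python
import pandas as pd

# Hypothetical automated decisions together with a protected attribute.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   0,   1,   0,   0],
})

# Approval rate per group; a large gap is a signal to investigate, not proof of bias.
rates = decisions.groupby("group")["approved"].mean()
gap = rates.max() - rates.min()

print(rates)
if gap > 0.2:  # threshold chosen purely for illustration
    print(f"Warning: approval-rate gap of {gap:.2f} between groups - review data and model.")
```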


Don’t forget about data privacy

Data privacy is another critical concern in the AI-driven data governance landscape. With the advent of stringent data protection regulations, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States, organisations must ensure that their AI and ML systems comply with these laws. This includes implementing robust data classification, anonymisation and encryption practices to protect personal and sensitive data. In addition, AI can be used to automate compliance tasks, such as monitoring data access, detecting breaches, and managing the rights of data subjects. However, this requires a delicate balance between leveraging AI for efficiency and maintaining control over data to meet regulatory obligations.
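
One common building block here is pseudonymisation, replacing direct identifiers with salted hashes before data reaches analysts. The sketch below is illustrative only, and pseudonymised data still counts as personal data under the GDPR, but it shows the basic idea:

```python
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-value"  # in practice, kept in a secrets manager

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a salted hash so that records stay
    linkable for analysis without exposing the raw identifier."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

patients = pd.DataFrame({
    "email": ["anna@example.com", "ben@example.com"],
    "age":   [34, 57],
})

# Expose only the pseudonym and the attributes actually needed for analysis.
analytics_view = pd.DataFrame({
    "patient_key": patients["email"].map(pseudonymise),
    "age": patients["age"],
})
print(analytics_view)
```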

Conclusion

Despite these challenges, the benefits of integrating AI and ML into data governance are undeniable. By automating routine tasks, improving data quality, enhancing metadata management and enabling predictive analytics, AI and ML can help organisations unlock the full potential of their data assets. However, successful implementation of AI-driven data governance requires a thoughtful approach that considers the ethical, regulatory and operational implications. Organisations must invest in the necessary tools, skills, and frameworks to harness the power of AI while maintaining trust, transparency, and compliance.

AI and ML are revolutionising data governance by automating processes, improving data quality and enabling predictive insights. As these technologies continue to evolve, they will play an increasingly important role in helping organisations effectively manage and leverage their data assets. However, their adoption must be accompanied by a commitment to ethical practices, regulatory compliance and human oversight to ensure that the benefits are realised without compromising trust and accountability. The organisations that successfully navigate these challenges will be best positioned to thrive in the data-driven economy.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

The Data Guardian from Sonova: A data science research project by Caroline Wanyonyi

Caroline Wanyonyi knows the world like the back of her hand. The computer scientist has worked as a Solutions Architect in Dubai, San Francisco and Paris. She completed her Master's degree in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts, and in this interview she gives us an insight into her research project. Find out why Caroline's heart beats for databases and how a good data governance framework can unlock the value of corporate data.

Shortcuts:
Intro | The Project | Results and Findings | Info-Events | Programme Information | Contact

Caroline Wanyonyi, graduate of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.


First of all, tell us something about yourself: What hashtags best describe you?

#FamilyFitnessFun
#ExploreDreamDiscover
#WellnessEQHumor
#DataDrivenLiving

Tell us more about them.

#FamilyFitnessFun: Life in my 50s with a preteen and a teen is chaotic but fulfilling. We hike, ski, and bike regularly and prioritise healthy living and having fun in our family. Although it’s becoming harder to convince kids of the importance of healthy eating and physical activity, I haven’t given up.

#ExploreDreamDiscover: As a proud citizen of Kenya and the U.S., travelling allows me to teach my children about the world firsthand and to explore different cultures and expand our horizons beyond our life in our lovely Swiss village. Inspired by Mark Twain’s philosophy of living without regrets, I embrace new experiences and adventures.

#WellnessEQHumor: I strive for continuous improvement in all aspects of life. I have found the concept of emotional intelligence to be particularly enlightening, and it has helped me to grow and develop while always maintaining a sense of humour.

#DataDrivenLiving: Lifelong learning is a core value for me. I am always eager to acquire new skills and learn new languages. And I read extensively, which keeps me mentally sharp and adaptable.

Volunteering enriches my life by connecting me with diverse people and industries, fostering a balanced, community-focused, fulfilling and evolving lifestyle.

About your job: What do you do at the Sonova Group?

As an Embedded Test Engineer, I test the mobile Software Development Kit (SDK) application as well as hearing instruments. I find helping to improve someone’s quality of life to be incredibly rewarding. In this role, I’ve learned a lot about working with medical instrument firmware, software, and mobile applications, all of which I find interesting and enriching because it helps to better understand the joint benefits of healthcare technology and software development.

What did you do before and why did you join the Sonova Group?

My background in computer science gives me flexibility in the type of work I do and has allowed me to gain experience in many fields. Before moving to Switzerland, I lived and worked in Dubai, Abu Dhabi, Paris, and San Francisco as a Solutions Architect for Oracle, which broadened my perspective and helped me develop my skills. I worked with clients on CRM enterprise architectures through all project stages involving hardware, database servers, networks, and operating systems. I was particularly drawn to databases and have experience with Oracle, MS SQL Server, DB2, and Sybase. After relocating to Switzerland, I paused my career so that I could focus on my family and figure out ways to best balance my professional and family life.

The Project

Please tell us about your research project.

I designed a data governance framework for Sonova by evaluating their current data governance strategy and researching various data governance models. This meant reading extensively, studying their in-house documentation and conducting stakeholder interviews to learn about the crucial aspects of this issue. I adapted an established framework to fit Sonova’s specific needs and conducted an interactive workshop to introduce it and gather more information. This workshop validated my approach and encouraged stakeholders to actively shape the company’s data governance strategy.

What data and method did you use, and what did you learn or hope to learn?

I used a mixed-method approach to engage key stakeholders through interviews and questionnaires to understand the data landscape and identify the pain points. My thesis included a detailed framework model, documentation, an environment assessment, a framework rationale, an implementation section, and a roadmap for further steps. The insights aimed to highlight the current state of data governance and identify the specific challenges and opportunities that are unique to Sonova. During the final workshop, we had lively discussions and provided stakeholders with a clear, actionable path forward.

When data is in flow: This illustration is about creating a data governance policy, which serves as the basis for the data governance framework.
From “Accuracy” to “Verification”: Caroline Wanyonyi used a word cloud in the workshop to provide stakeholders with brainstorming ideas for creating principle statements for the data governance policy.


Results and Findings

How can your insights help society?

Effective data governance is crucial in today’s data-driven world, but we often lack detailed information about how to assess, design and implement it effectively. My thesis addresses this gap by providing a clear and practical guide for organisations. Data governance is an ongoing process requiring resources, delegation and changes in the organisational culture. These insights can help organisations leverage data as an asset, driving value creation and ensuring continuous improvement. A well-structured data governance framework can improve data quality, compliance and strategic decision-making within organisations.

How would you like to pursue your project in future?

When I started the Master’s programme, I wanted to write a thesis that I could either publish or that someone could implement. My project remains confidential, but the next phase involves several preliminary steps for implementation. I am eager to be actively involved in bringing this data governance framework to life. My background as a Solutions Architect helps me a lot with managing the complexities of data governance.

How did your studies influence the project?

The programme offered a broad experience with various courses, including programming, leadership, human-centred design, data ideation, AI modelling, legal aspects of data, and cloud architecture. Each course covered essential topics, highlighting different facets of data science. Accessing high-quality data was a challenge throughout our project work, underscoring the importance of data governance. The programme covered both the business and technical sides of my experience, which I then applied in detail in my thesis by drawing on my knowledge of research design, interviews, workshops, visualisations and stakeholder presentations.

What advice would you give others starting on a similar project?

Start thinking about your thesis topic early and, if possible, reserve your last semester only for your thesis. Avoid rushing, as your thesis should reflect the depth of your knowledge and efforts. It’s a significant milestone, and you want it to be convincing and relevant. Maintain an ongoing dialogue with stakeholders and involve them at every stage. Set a realistic timeline and start early, as coordinating things with busy stakeholders can be challenging. Data governance projects are people-oriented and thus require you to engage with senior leaders. Having a client for your thesis means having a real-world scenario, which will make the theoretical aspects more practical and help you transition into your career.

And finally, what new hashtag are you aiming for?

#DGProgramLeader: I am dedicated to supporting organisations in leveraging data as an asset by implementing effective data governance and staying current with industry trends.

We would like to thank Caroline Wanyonyi for her dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Data-driven diabetes testing: A data science research project by Yasmine Mohamed

Yasmine Mohamed is determined to find non-invasive, data-driven ways to help people with type 1 diabetes. She used data science methods to collect data from commercially available wearable devices, a continuous glucose monitor (CGM) and diary notes to train machine learning models to predict hypoglycemia - a condition that can be dangerous during sleep. Find out what Yasmine has learnt from her research and how it can help our society in the fight against chronic diseases such as diabetes.

Shortcuts:
Intro | The Project | Results and Findings | Info-Events | Programme Information | Contact

Yasmine Mohamed, graduate of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.


First of all, tell us something about yourself: What hashtags best describe you?

#Multipotentialite
#Positive
#ExpatLife
#NatureLover 

Tell us more about them.

#Multipotentialite: A multipotentialite has more than one passion and a range of interests. In my case, I went from pharmacy to biotechnology to data science. In another life, I could have been a writer, an engineer, a craftsperson or an interior designer.

#Positive: I like to always have a positive mindset. Even when things may not be going my way, I look for the silver lining and make the best out of the situation.

#ExpatLife: I have been an expat for many years and have lived in three countries. Expat life has definitely shaped me in many ways. It can be hard sometimes, but also very rewarding to experience life from a different perspective. It’s a guaranteed way to stretch your mind.

#NatureLover: One of the things I love most about living in Switzerland is the picturesque views and the opportunity to immerse myself in nature. It’s a blessing.

About your job: What do you do at the moment?

I have been fully dedicated to the Master’s programme. Before that, I took a career break, and before that I worked in academia as a teaching and research assistant.

What did you do before and why did you decide to do the Masters?

I became interested in data while working on my Master’s in biotechnology previously. The thesis project mainly involved wet lab work, but eventually it included a step of DNA sequencing where I had to analyse the data by using bioinformatics tools. This fascinated me, and I was tempted to pursue a bioinformatics or data science degree. After much deliberation, I decided on data science, as it is a broader field where I would learn about a wide range of technologies, concepts and data types.

The Project

Please tell us about your research project.

Given my background in healthcare and life sciences, I was keen on choosing a thesis project in health data science. Recently, several research groups have been looking for ways to use the huge data pools collected from wearable devices such as heart rate monitors and step counters and to help people with chronic conditions, for example diabetes, to manage their illness. Diabetics are at risk of hypoglycemia, i.e. very low blood sugar levels, which can be dangerous, especially when it sets in during sleep. They therefore have to check their blood sugar levels regularly. To date, sugar levels can be measured only with invasive methods, such as finger pricking or a continuous glucose monitor, a device attached to the body that measures sugar levels every five minutes.

The research question is: “Can we use health data collected by wearables, such as heart rate or blood oxygen levels, to detect hypoglycemia non-invasively?” Research in this area predominantly focuses on medical-grade wearables because of their accuracy. Our project, however, studies whether good results can also be achieved with commercial-grade wearables. If successful, the results would point to a more practical method that puts non-invasive hypoglycemia detection within easier reach of end users.

To move ahead with this project, I needed funding and connections, which I luckily received from everyone involved. So, here’s a huge Thank You to all who made this project possible!

From the idea to the machine learning model. Overview of the project design.

Results and Findings

What data and method did you use, and what did you learn or hope to learn?

Once we received the approval from the Ethics Committee, we started recruiting Type 1 diabetic patients. The participants were given two wearable devices. The first was an Apple Watch Series 8 to collect health data on heart rates, heart rate variability, blood oxygen levels, respiratory rates and step counts, among other things. The second was a continuous glucose monitor (CGM) to measure sugar levels every five minutes.

The data was collected over ten days, during which the participants entered information about their mealtimes, medication and physical activity in a diary. After the ten days, we combined the data from the Apple Watches, CGM and the diary, prepared it and used it to train machine learning models to predict hypoglycemia.
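
Purely as an illustrative sketch (the column names, the 3.9 mmol/L threshold and the model choice are assumptions for the example, not details of the actual pipeline), the core steps of such a workflow, aligning wearable and CGM time series, labelling hypoglycemic episodes and training a classifier, might look like this in Python:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-ins for the two data streams, both at 5-minute resolution.
idx = pd.date_range("2024-01-01", periods=2000, freq="5min")
watch = pd.DataFrame({
    "heart_rate": rng.normal(70, 10, len(idx)),
    "spo2": rng.normal(97, 1, len(idx)),
    "steps": rng.poisson(20, len(idx)),
}, index=idx)
cgm = pd.DataFrame({"glucose_mmol_l": rng.normal(6.5, 1.5, len(idx))}, index=idx)

# Align the wearable features with the CGM readings on the shared timestamps.
data = watch.join(cgm)
# Illustrative label: hypoglycemia defined here as glucose below 3.9 mmol/L.
data["hypo"] = (data["glucose_mmol_l"] < 3.9).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    data[["heart_rate", "spo2", "steps"]], data["hypo"],
    stratify=data["hypo"], random_state=0,
)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# With purely synthetic data there is no real signal, and on such an imbalanced
# problem accuracy alone would be a poor metric; this only shows the mechanics.
print("test accuracy:", model.score(X_test, y_test))
```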

The results look promising. Despite the many limitations of using a commercial wearable, we could build models that perform well. We also understood which variables were necessary for the model to achieve a good result.

Day and night data. Models were built separately for data collected during the day and at night. This figure shows the contribution of different feature groups to model predictions.
A look at the collected data. Model performance for one of the participants.

How can your insights help society?

This pilot project provides promising insights about integrating algorithms for hypoglycemia warnings into commercial-grade wearables. Managing a chronic disease like diabetes is complex, and thus providing patients with an effective means of managing their condition can make a huge difference.


How would you like to pursue your project in future?

Two things come to mind in connection with substantiating the project results. Firstly, I want to look into using other commercial wearables, such as a Fitbit, to ensure that the results are reproducible and to study this topic on a larger scale. Secondly, I think it is worth conducting a similar study with children, as they unfortunately also suffer from Type 1 diabetes.

How did your studies influence the project?

The idea for the thesis was actually inspired by a project required for one of the modules. During the Master’s programme, I learned about many of the concepts and tools that I needed for the project, such as Python coding practices, data wrangling methods, various machine learning techniques and AWS services.

What advice would you give others starting on a similar project?

Dare to do something new, work with supportive people and be very patient (I had to extend my thesis submission deadline twice)! Take your time to process the data before proceeding with modelling and watch out for the garbage-in-garbage-out effect!

And finally, what new hashtag are you aiming for?

#Growth both at the personal and the career levels.

We would like to thank Yasmine Mohamed for her dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

The crucial role of data quality in effective data governance

"Ensuring high data quality is essential for effective data governance as it impacts decision making, regulatory compliance, risk management and customer satisfaction," says Dr Dimitrios Marinos in his guest article. Robust data governance frameworks can improve operational efficiency and drive innovation by maintaining data accuracy, completeness, consistency, timeliness and relevance. Our HSLU lecturer recommends investing in data quality to enable organisations to fully leverage their data assets and gain a strategic advantage in the digital age. Read the article to find out more.

Shortcuts:
Intro | Consequences of poor data quality | The benefits of high quality data | Examples | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


Data governance and data quality go hand-in-hand

In the digital age, data has become one of the most valuable assets an organisation has. However, the power of data can only be effectively harnessed through robust data governance. Data governance encompasses the processes, policies and standards that ensure data is managed, protected and used in a way that supports the organisation’s objectives while complying with relevant regulations.

Data quality is a cornerstone of effective data governance, as it is a critical element in the quest for reliable, accurate and valuable information for decision-making within organisations. As data becomes an increasingly important asset in the digital age, ensuring its quality is paramount to achieving strategic objectives, maintaining regulatory compliance and fostering stakeholder trust.

Data quality encompasses several dimensions, including accuracy, completeness, consistency, timeliness and relevance. Together, these attributes ensure that data is fit for purpose. Inaccurate data can lead to incorrect conclusions, misguided strategies and, ultimately, financial losses. For example, a retail company relying on inaccurate sales data might overstock unpopular products or understock bestsellers, resulting in lost revenue and customer dissatisfaction.

Consequences of poor data quality

Completeness is a critical aspect of data quality. Incomplete data can lead to a distorted understanding of business operations and customer behaviour. For example, if a financial institution lacks comprehensive data on its customers’ credit histories, it may incorrectly assess their creditworthiness, leading to higher default rates. Data consistency ensures uniformity across different systems and databases, which is critical for integrated analysis and reporting. Discrepancies in data between departments can cause confusion, hinder collaboration and hamper strategic initiatives. Data timeliness is crucial in fast-paced environments where decisions need to be made quickly. Out-of-date data can render insights irrelevant and hinder an organisation’s agility. In sectors such as finance and healthcare, where real-time data is often required, delays can have serious consequences. Relevance ensures that data is fit for purpose and delivers value. Irrelevant data can clutter databases, complicate analysis and distract from critical insights.
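
These dimensions can be measured directly. The sketch below, using hypothetical customer data, computes simple indicators for completeness, consistency and timeliness; the column names, reporting date and thresholds are assumptions for illustration:

```python
import pandas as pd

# Hypothetical customer data combining fields from two source systems.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "credit_history": ["good", None, "poor", "good"],
    "country_crm": ["CH", "DE", "CH", "FR"],
    "country_billing": ["CH", "DE", "AT", "FR"],
    "last_updated": pd.to_datetime(["2025-06-01", "2025-06-02", "2023-01-15", "2025-06-03"]),
})

reporting_date = pd.Timestamp("2025-06-30")  # fixed here to keep the example reproducible
report = {
    # Completeness: share of records with the attribute filled in.
    "credit_history_completeness": 1 - customers["credit_history"].isna().mean(),
    # Consistency: do the two systems agree on the same fact?
    "country_consistency": (customers["country_crm"] == customers["country_billing"]).mean(),
    # Timeliness: share of records refreshed within the last twelve months.
    "updated_last_12_months": (
        customers["last_updated"] > reporting_date - pd.DateOffset(years=1)
    ).mean(),
}
print(report)
```

Indicators like these, tracked over time, are what turn abstract quality dimensions into something a governance board can monitor and act on.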

Figure 1: The core dimensions of data quality in data governance 

The benefits of high quality data

Data governance refers to the framework and processes established to manage the quality, accessibility, usability and security of data within an organisation. Effective data governance ensures that data assets are well managed and used to their full potential. It involves defining roles and responsibilities, setting standards and implementing policies to maintain data integrity throughout its lifecycle. A robust data governance framework is crucial to maintaining data quality. It establishes clear protocols for data entry, storage, maintenance and use, ensuring consistency and accuracy. Data governance bodies, often made up of representatives from different departments, oversee the implementation of these protocols and foster a culture of accountability and continuous improvement.

One of the most important implications of data quality for data governance is the improvement of decision-making capabilities. High quality data enables organisations to make informed, evidence-based decisions. In contrast, poor data quality can lead to misinformed decisions, strategic missteps and increased risk. For example, an organisation with accurate and timely market data can better anticipate trends, adjust strategies and maintain a competitive edge. Conversely, decisions based on inaccurate data can lead to missed opportunities and strategic failures.

Data quality also plays a critical role in regulatory compliance. Organisations across a wide range of industries are subject to stringent data management, privacy and reporting regulations. High quality data ensures that organisations can meet these regulatory requirements accurately and efficiently. Failure to comply can result in legal penalties, financial loss and reputational damage.

In healthcare, for example, inaccurate patient data can lead to violations of laws such as the Health Insurance Portability and Accountability Act (HIPAA), resulting in significant fines and loss of trust. Data quality also underpins effective risk management. Accurate and complete data enables organisations to identify, assess and mitigate risk more effectively. In the financial industry, for example, reliable data is essential for assessing credit, market and operational risk. Poor data quality can obscure potential threats and lead to inadequate risk responses, increasing the likelihood of adverse events.


From customer satisfaction to strategic initiatives and beyond

Customer satisfaction and trust are heavily influenced by data quality. Inaccurate or inconsistent data can lead to poor customer experiences, undermining trust and loyalty. For example, a customer who receives incorrect billing information from a service provider is likely to be dissatisfied and may seek alternatives. Ensuring data quality helps to maintain positive customer relationships, enhancing brand reputation and customer retention. Operational efficiency is an area where data quality has a profound impact. High quality data streamlines processes, reduces errors and minimises rework. In supply chain management, accurate data on inventory levels, demand forecasts and supplier performance enables efficient operations and cost savings. Conversely, poor data quality can lead to inefficiencies, increased operating costs and wasted resources.

Data quality also facilitates innovation and strategic initiatives. Accurate and relevant data is the foundation for identifying opportunities, optimising processes and developing new products or services. In the era of big data and advanced analytics, the ability to leverage high-quality data is a competitive differentiator. Organisations with superior data quality can leverage artificial intelligence and machine learning to gain deeper insights and drive innovation. Implementing effective data quality management requires a combination of technology, process and people. Advanced data quality tools and technologies, such as data profiling, cleansing and validation, are essential for identifying and correcting data issues. But technology alone is not enough. Organisations need to establish comprehensive data quality processes, including regular audits, monitoring and continuous improvement initiatives.

Equally important is fostering a data-driven culture within the organisation. Employees at all levels should understand the importance of data quality and their role in maintaining it. Training programmes, clear communication and incentives for data stewardship can help embed data quality principles into the fabric of the organisation. In short, high-quality data strengthens decision-making, regulatory compliance and risk management, improves customer satisfaction and drives operational efficiency. As data continues to grow in volume and complexity, prioritising data quality through robust governance frameworks is essential for organisations seeking to thrive in the digital age. By investing in data quality, organisations can unlock the full potential of their data assets and gain a strategic advantage in an increasingly data-driven world.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Sports Hackdays 2024 – Participants welcome!

Discover. Connect. Hack. That's the motto of the upcoming Sports Hackdays 2024. Find out why Dr Thomas Wüthrich's organising team believes that "prototyping for the world of sport" is the next big thing and why it is important to continue a legacy that the Lucerne University of Applied Sciences and Arts began in 2019. If you want to be part of this mission, apply now for Sports Hackdays 2024.

Shortcuts:
Intro | Apply now | The challenges | Conditions | Info-Events | Programme Information | Contact

(From left to right): Dr Thomas Wüthrich (scoretec by BizNet AG) is organising the Sports Hackdays 2024 together with Martin Rumo (Lucerne University of Applied Sciences and Arts) and former Applied Data Science student Keith Lawless, supported by the Swiss Olympic Association and Swiss University Sports.

What are the Sports Hackdays?

The Sports Hackdays 2024 is a data hackathon event taking place at the House of Sport in Berne, Switzerland. Its central aim is to develop Switzerland as a major innovation hub in the world of sport by bringing together people from sport, the sports tech industry and academia.

“Together with our key partner Swiss Olympic Association and supported by Swiss University Sports, we strongly believe in the power of rapid prototyping to replace weeks of meetings with two days of action”, says Dr Thomas Wüthrich. Incidentally, he attended the Sports Hackdays in 2021 and was a Challenge Coach in 2022. Now, he is organising the two-day event together with an organising committee of data science students and lecturers, partner organisations and like-minded people. In doing so, the team is continuing a legacy that began and evolved at the Lucerne University of Applied Sciences and Arts.

“It is our vision that the Sports Hackdays will become a key initiative to win the sports innovation game for Switzerland by driving advances in sports data analytics and fostering a vibrant innovation community”, says Dr Thomas Wüthrich.

The Sports Hackdays are derived from the idea of a hackathon – a neologism made up of the words hack and marathon. It is an event where data-driven software and hardware are developed in cross-functional teams, often within a 42-hour timeframe – a nod to the marathon. So, instead of investing a lot of time and money in conceptual work on how to use existing data, solutions are sought first and possible technologies are developed in an agile manner based on concrete tasks. In this way, the participants are confronted with real implementation problems at an early stage.

HSLU Sport Hackdays 2021

Watch some video impressions of the Swiss Sport Hackdays 2021.

Let’s increase the success of Swiss sports – Join us!

Join the Sports Hackdays 2024 and work as a team to solve sports data challenges. You will meet experts from academia and industry leaders, and learn first-hand about real-world challenges and opportunities from sports federations, clubs and world-class athletes. An amazing prize awaits the winning team!

SAVE THE DATE:
Date: 12 & 13 October 2024
Venue: Haus des Sports, Talgut-Zentrum 27, 3063 Ittigen bei Bern


What is a Sports Hackdays Challenge?

A challenge consists of two things: A well-prepared and documented dataset, and a specific sports need for information and/or its presentation. We want you to really demonstrate (“prototype”) what data can do in sport. We offer five carefully selected and prepared challenges for you to choose from. The domain areas include (but are not limited to)
– athlete development,
– team performance analysis,
– fan engagement,
– smart stadium and sustainability,
as well as various technology categories.

“For example, in previous editions we did real-time analytics with Kafka, a video overlay with Vizrt for broadcast – they helped and brought their own super fast computer. We also had a Web3 Challenge with the International Ice Hockey Federation“, says Dr Thomas Wüthrich.

When learning becomes fun and innovative

As a participant, you will have the unique opportunity to expand your network across academia and industry. Gain real-world experience by applying and showcasing your skills – and perhaps a chance for your team’s solution to help sport immediately, as seen in previous editions. You work with like-minded and highly motivated people.

“We even provide an expert coach from academia and/or industry to support you and your ideas. This is a truly valuable opportunity to gain knowledge and practical skills while having fun and complementing classroom learning”, says Dr Thomas Wüthrich.

This is what you need:
– time and attention,
– your expertise in data science, computer science, sports science, sports management
– and any other creative talents.

“We take care of the rest, such as the venue, dedicated computing power and food and drink. If you need to sleep, we have a very reasonable accommodation offer at the Youth Hostel in Berne. In addition to the participants, we are also looking for an organising committee member from the student body of the Lucerne University of Applied Sciences and Arts”, says Dr Thomas Wüthrich.

Interested? Any questions? Get in touch with us!

If you have any questions, please contact Keith Lawless (klawless@sportshackdays.ch) or Martin Rumo (martin.rumo@hslu.ch), who are part of the organising committee.

Contact us: info@sportshackdays.ch
Visit us: www.sportshackdays.ch

Here you can read some of the articles from the Sports Hackdays 2021:

Sports Hackdays 2021 – Challenge: Moneyball goes Football

Data Science: Hackdays Challenge – Data Science Meets Football

Data Science: Hackdays Challenge – Run Against Your Predicted Time

We would like to thank Dr Thomas Wüthrich for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Guardians of data: Ensuring ethics in the digital world

"Ethical data governance is essential to ensure privacy, fairness, transparency and accountability in the handling of data across sectors. It involves implementing policies that protect individual rights and promote trust, with significant implications for healthcare, finance, education and public trust", says our HSLU lecturer, Dr Dimitrios Marinos. As data-driven decision making becomes more prevalent, the importance of ethical practices in data governance continues to grow, influencing legal frameworks and societal norms. Read more to learn how data and ethics are intertwined.

Shortcuts:
Intro | Fairness, transparency, accountability | Healthcare and financial sectors | Big Data, AI and GDPR | Social justice, equity and education | Conclusion | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


The core of data governance

The ethical use of data is a central concern in the field of data governance, where the principles of privacy, fairness, transparency and accountability are intertwined. As data becomes increasingly central to decision making in various sectors, the need for ethical data governance frameworks becomes more urgent. These frameworks ensure that data is handled in a way that respects individual rights and societal norms, thereby fostering trust and sustainability in data-driven practices.

At its core, ethical data governance involves the development and implementation of policies that guide the collection, storage, processing and sharing of data. These policies are based on ethical principles that prioritise the protection of individual privacy, the prevention of harm, and the promotion of fairness and equity. Privacy is a fundamental element as it concerns the right of individuals to control their personal information. Effective data governance frameworks must incorporate robust privacy protections, such as minimising data collection to what is necessary, ensuring data accuracy, and securing data against unauthorised access and breaches.

Figure 1: The implications and motivations for ethical data governance 

Fairness, transparency, accountability

Fairness in data governance refers to the equitable treatment of individuals and groups in data processes. This includes preventing bias in data collection and algorithmic decision-making that could lead to discriminatory outcomes. Data governance policies should ensure that data sets are representative and that algorithms are regularly tested for bias.
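
As a small, hedged illustration of the first point, representativeness can be checked by comparing group shares in a training sample with a reference population; the figures below are invented for the example:

```python
import pandas as pd

# Reference shares of each group in the relevant population (assumed known here).
population_share = pd.Series({"group_a": 0.50, "group_b": 0.30, "group_c": 0.20})

# Hypothetical training sample drawn from operational systems.
sample = pd.Series(["group_a"] * 620 + ["group_b"] * 310 + ["group_c"] * 70)
sample_share = sample.value_counts(normalize=True)

# Representation gap per group; strongly negative values mean under-sampling.
gap = (sample_share - population_share).round(3)
print(gap.sort_values())
```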

Transparency is also critical to ethical data governance. Organisations must be open about their data practices and provide clear information about how data is collected, used and shared. This transparency helps build trust with stakeholders, including customers, employees and regulators.

Accountability is another crucial aspect of ethical data management. Organisations need to be accountable for their data practices, with clear roles and responsibilities for data management. This includes implementing oversight and enforcement mechanisms, such as data protection officers and independent audits. Accountability also means providing individuals with the means to challenge and correct inaccuracies in their data, and ensuring that their rights are upheld.

Examples from the healthcare and financial sectors

The implications of ethical data management are far-reaching, affecting different sectors and society as a whole. In healthcare, for example, the ethical use of data can improve patient outcomes while protecting patient privacy. Healthcare providers can use data to improve diagnosis and treatment. But they must do so in a way that meets ethical standards and legal requirements. This includes obtaining patient consent for data use, anonymising data to protect identities, and ensuring that data sharing is done with the utmost care.

In the financial sector, ethical data governance is essential to prevent fraud, protect consumer privacy and promote financial inclusion. Financial institutions collect vast amounts of personal and transactional data that, if misused, can cause significant harm. Ethical governance frameworks help these institutions manage data responsibly, ensuring that customer data is protected and used to improve service delivery without compromising privacy or fairness.

Extra players: Big Data, Artificial Intelligence (AI) and GDPR

The rise of big data and artificial intelligence (AI) presents both opportunities and challenges for ethical data governance. Big data enables organisations to uncover insights and patterns that can drive innovation and efficiency. However, the sheer volume and variety of data being collected raises privacy and consent concerns. AI systems, which often rely on large data sets, can inadvertently perpetuate biases present in the data. Therefore, ethical data governance must address these challenges by implementing practices that ensure data quality, mitigate bias, and maintain transparency in AI processes.

The implications of ethical data governance extend to regulatory and legal frameworks. Governments and regulatory bodies are increasingly recognising the importance of data ethics. They are enacting laws and regulations to enforce ethical standards. The General Data Protection Regulation (GDPR) in the European Union is a prime example, setting strict requirements for data protection and privacy. Compliance with such regulations is not only a legal obligation, but also an ethical imperative for organisations. While these regulations help establish a baseline for ethical data practices, organisations must go beyond mere compliance to foster a culture of ethics in data governance.

About social justice, equity and education

Moreover, the ethical use of data has significant implications for social justice and equity. Data-driven decision-making can either perpetuate or mitigate social inequalities. For example, biased data and algorithms can reinforce systemic discrimination in areas such as hiring, lending and law enforcement. Ethical data governance requires organisations to proactively address these issues by ensuring that their data practices do not disproportionately harm marginalised communities. This includes adopting inclusive data collection methods, auditing algorithms for bias, and engaging with diverse stakeholders to understand the societal impact of data use.

In education, ethical data governance can improve learning outcomes while protecting student privacy. Educational institutions collect a wide range of data about students, from academic performance to personal information. Ethical governance frameworks ensure that this data is used to support student success without violating student privacy. This includes obtaining informed consent from students and parents, anonymising data, and using data analytics to identify and support at-risk students in a fair and respectful way.


Public trust in the data-driven world is fragile

The implications of ethical data management also extend to the realm of public trust. In an era where data breaches and misuse are increasingly common, public trust in data-driven organisations is fragile. Ethical data governance helps build and maintain that trust by demonstrating a commitment to responsible data practices. Organisations that prioritise ethics in their data governance are more likely to earn the trust and loyalty of their customers, employees and partners. This trust, in turn, can become a competitive advantage that drives long-term success and sustainability.

The ethical use of data is a fundamental component of effective data governance. By adhering to principles of privacy, fairness, transparency and accountability, organisations can responsibly navigate the complexities of the data-driven world. The impact of ethical data governance is profound, influencing sectors such as healthcare, finance and education, and shaping societal norms and regulatory frameworks. As data continues to drive innovation and decision-making, the importance of ethical data governance will only grow, underscoring the need for organisations to embed ethical considerations into their data practices.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Navigating the risks of data governance

"Clear policies and effective implementation are essential for robust data governance", says Dr Dimitrios Marinos in his guest article. Organisations need to develop comprehensive policies, ensure proper employee training, leverage technology and foster a data-driven culture. This will mitigate risks such as data breaches, regulatory non-compliance and poor data quality, and support long-term business success and regulatory adherence.

Shortcuts:
Intro | Data governance policies | Ineffective implementation | Case study | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


In the digital age, data has become one of the most valuable assets organisations have. However, the power of data can only be effectively harnessed through robust data governance. Data governance encompasses the processes, policies and standards that ensure data is managed, protected and used in a way that supports the organisation’s objectives while complying with relevant regulations. Two key challenges in this area are the lack of clear policies and ineffective implementation of data governance frameworks. These challenges can have a profound impact on an organisation’s ability to effectively manage its data, leading to various risks and missed opportunities.

The importance of clear data governance policies

Clear data governance policies are the foundation of any successful data management strategy. These policies define the rules and guidelines for how data should be handled, who is responsible for various data-related activities, and how compliance will be monitored and enforced. Without clear policies, organisations are vulnerable to inconsistent data management practices, increased risk of data breaches and potential non-compliance with regulatory requirements.

Figure 1: The importance of clear data governance policies in data governance 

In the modern digital landscape, data is a critical asset that drives decision-making, operational efficiency and strategic planning. However, the power of data can only be effectively harnessed through robust data governance. At the heart of this governance are clear data governance policies, which are essential to ensure that data is managed, protected and used appropriately within an organisation.

Clear data governance policies establish a standardised approach to data management across the organisation. This standardisation is critical to maintaining data quality, integrity and reliability. When every department follows the same guidelines for collecting, storing, processing and sharing data, the likelihood of errors and inconsistencies is greatly reduced. Consistent data management practices ensure that data remains accurate and trustworthy, which is essential for making informed business decisions.

Well-defined policies clarify data management roles and responsibilities. They specify who is responsible for various data-related activities, from data creation and maintenance to security and compliance. This clarity is essential to ensure that data governance tasks are performed correctly and efficiently. It also strengthens accountability, as individuals and teams can be held responsible for how they handle data. Without clear policies, accountability becomes ambiguous, leading to potential mismanagement and neglect.

In today’s regulatory environment, compliance with data protection laws such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) is non-negotiable. Clear data governance policies help organisations navigate the complex landscape of regulatory requirements. These policies outline the necessary steps to ensure data is handled in accordance with regulatory standards, mitigating the risk of non-compliance. Non-compliance can result in severe penalties, legal consequences and reputational damage. In addition, clear policies help identify and manage the risks associated with data handling and ensure that data is protected from breaches and unauthorised access.

Clear data governance policies streamline data management processes, making them more efficient. By providing a roadmap for how data should be handled, these policies eliminate ambiguity and reduce the time and resources spent resolving data-related issues. Efficient data management improves overall operational efficiency, allowing organisations to focus on their core activities and strategic initiatives. In addition, clear policies facilitate better communication and collaboration between departments, as everyone adheres to the same policies and standards.

Trust is a critical component in any organisation’s relationship with its stakeholders, including customers, partners and regulators. Clear data governance policies demonstrate a commitment to privacy and ethical data management. This commitment fosters trust by reassuring stakeholders that their data is being handled responsibly and securely. At a time when data breaches and misuse are common concerns, having a transparent and well-enforced data governance policy is a significant advantage.

The consequences of ineffective implementation

Even with clear policies in place, their effectiveness depends on proper implementation. Ineffective implementation of data governance frameworks can negate the benefits of well-defined policies. This ineffectiveness can manifest itself in a number of ways, including inadequate training, lack of enforcement, insufficient resources and failure to integrate data governance into day-to-day operations.

Figure 2: The core of ineffective implementation of data governance

The consequences of ineffective implementation are significant.

Inconsistently applied data governance policies lead to data inconsistencies and poor quality, undermining the reliability of data-driven decisions because stakeholders cannot trust the data they are using. Ineffective implementation also often leads to weak data security practices, exposing sensitive data to unauthorised users and increasing the risk of data breaches. Organisations may also fail to comply with data protection regulations if they do not effectively implement their data governance policies. Regulators expect organisations to have policies in place and to demonstrate compliance. Poor implementation of data governance frameworks can lead to operational inefficiencies if data governance tasks are not integrated into daily workflows, causing employees to spend excessive time on manual data management activities that take them away from their core responsibilities.

In addition, ineffective implementation can undermine confidence in the organisation’s data governance practices. Both internal and external stakeholders need to be confident that the organisation can effectively manage and protect data. Loss of trust can damage the organisation’s reputation and its relationships with customers, partners and regulators. Ensuring the effective implementation of data governance frameworks is therefore critical to maintaining data integrity, security, compliance, operational efficiency and stakeholder trust.


Case study: A real-world example

To illustrate the importance of clear policies and effective implementation, let’s consider a hypothetical case study of a multinational corporation, FictionAG. FictionAG operates in a variety of industries, including finance, healthcare and retail, each with its own data governance challenges. Initially, FictionAG faced significant data governance issues due to a lack of clear policies and ineffective implementation. Different departments followed different data management practices, leading to data inconsistencies, security vulnerabilities and regulatory compliance risks. Recognising the need for improvement, the company embarked on a comprehensive data governance overhaul.

FictionAG began by developing a clear and comprehensive data governance policy. They formed a cross-functional team to ensure that the policies addressed the unique needs of each department while maintaining overall consistency. The policies covered data quality standards, access controls, data security measures and compliance requirements. To ensure effective implementation, FictionAG invested in extensive training programmes for employees at all levels. They ran workshops and produced detailed documentation to help people understand and comply with the new policies. They also established a Data Governance Council with representatives from key departments to oversee policy enforcement and address any issues.

The company also allocated significant resources to support its data governance initiatives. They implemented advanced data management and security technologies, hired data governance specialists, and increased their budget for ongoing training and monitoring. By integrating data governance tasks into daily operations, FictionAG ensured that data management practices were aligned with its policies. They introduced regular data quality checks, access reviews and compliance audits to monitor adherence to policy and identify areas for improvement.
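To make the idea of a recurring data quality check a little more concrete, here is a minimal sketch in Python using pandas. The table, column names and rules are purely illustrative assumptions, not FictionAG's actual checks; in practice such rules would be derived from the organisation's own policy.

# Minimal sketch of an automated data quality check (illustrative only).
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a small report of common data quality issues."""
    report = {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        # Share of missing values per column
        "missing_ratio": df.isna().mean().round(3).to_dict(),
    }
    # Example rule: customer_id (a hypothetical key column) must be unique
    if "customer_id" in df.columns:
        report["duplicate_customer_ids"] = int(df["customer_id"].duplicated().sum())
    return report

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, "b@example.com", "c@example.com"],
    })
    print(run_quality_checks(customers))

A report like this can be scheduled to run with every data load, so that issues are surfaced continuously rather than discovered during an annual audit.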

The results of FictionAG’s data governance overhaul were remarkable. They achieved greater consistency in data management practices, significantly reduced security vulnerabilities and ensured compliance with relevant regulations. In addition, the company fostered a culture that values data governance, resulting in improved data quality and increased stakeholder trust.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Real projects, real skills: Creating partnerships for learning with bydo

Real projects, real skills: Creating partnerships for learning with bydo

The bydo concept is all about optimising learning from real projects, offering training that keeps pace with the dynamics of new technologies, and providing a venue where future talent and employers can interact and learn to appreciate each other. This article explains how this initiative came about, the expectations that its founders Peter Delfosse and Professor Dr Andreas Brandenberg have, and why students – our future specialists – will benefit from this new approach to learning.

Shortcuts:
Intro | Learn more about bydo | Info-Events | Programme Information | Contact

The team behind the initiative (from left to right): Professor Dr Andreas Brandenberg, Cosima Lang, Anuschka Henn and Peter Delfosse.

Professor Dr Andreas Brandenberg, head of the Master’s programme in Applied Information and Data Science at Lucerne University of Applied Sciences and Arts, developed the bydo concept together with Peter Delfosse, CEO of the international digitalisation company Axon. While Andreas dreams of extending Swiss university programmes beyond the traditional classroom, Peter is looking for talent capable of delivering the value that the new technologies offer. They both believe it’s time for educators and practitioners to start singing more closely from the same hymn sheet. In our interview, they share their vision for the initiative.

Andreas, why do universities need to work more closely with industry?

Andreas: If we look at current job titles, the content of these jobs and the profiles they require, we see that most of them didn’t even exist three years ago. Universities, on the other hand, generally have the luxury of taking two to three years to develop a new course. In short, planners at universities often know little about where their programmes and courses are heading or what skills they should be teaching. There are clear indications that practitioners and educators need to find new ways of working together more closely.

Peter, how do you address this issue in your company?

Peter: For years I’ve heard people around me bemoan the fact that the graduates they recruit from colleges or universities don’t have the skills and knowledge that are needed, that they get too few of them too late, and that they first have to train them. So, we have to ask ourselves: do we keep complaining or do we bring about lasting change in how businesses and educational institutes work together? The aim of the bydo initiative is to show that there are plenty of able people out there who are well trained and employable, who are motivated to work on projects in a range of applied fields, and who are willing to further develop their practical skills. With bydo, we are simply looking to provide a means with which to tap deeper into the resources we have. It’s not that I happen to know a lecturer at a university who happens to be doing yet another project. The bydo approach is more structured and aims to give us access to a vast set of resources that are not yet being used. That’s what the entrepreneurial side is all about.

Andreas: I think that all sides clearly have some reservations and that we have a mismatch of expectations: We have companies that don’t know how to work with students or how to engage them, we have students who don’t understand what’s required of them in terms of performance and quality, and we have entire industries that don’t realise what potential these students have.

We understand that bydo aims to meet expectations more fully among those who work together. But what exactly does it offer that can’t be taught in the classrooms at Lucerne University of Applied Sciences and Arts?

Andreas: What is crucially important in connection with enabling technologies such as data science, machine learning and artificial intelligence is that they all intrinsically have value, but this value becomes apparent only when these technologies are implemented. What we lack is the expertise to put it all together. We can expect that the way we programme, apply new tools, work in low-code environments and use artificial intelligence will become relatively easy in future. But implementing these technologies so that they add value and become profitable for companies and organisations will always remain a creative act that requires context and cannot be taught in lectures and classrooms. Doing so requires a product and process landscape. In other words, technical learning calls for vast layers of context that universities are less and less able to provide. What’s more, many of these technologies run on systems that universities are already struggling to afford and that may be out of reach in the medium term.

bydo is a collaborative setting in which companies, students, experts, and educational institutions can learn.

Almost every course by now offers applied modules. What makes bydo special?

Andreas: Of course, degree programmes generally already include applied projects – a lot of them! – and many master’s theses are also written for the benefit of industry partners. But that’s not enough. I believe that tackling the really big challenges calls for an approach that works independently of traditional teaching and the academic calendar. What’s crucial in our concept is that students are not left to fend for themselves but get the support they need from coaches who have a relationship with the educational institution and can guarantee the quality of the training being offered. We believe this to be the decisive factor. We have thought through a wide range of concepts and believe that bydo offers a finely balanced solution that evenly addresses the interests of the industry, educational institutions and students. In other words, all sides stand to benefit.

Peter: Yes, especially students who are quick on the uptake will benefit from bydo and won’t have to wait two or three years before they can apply what they’ve learned at university. Of course, no company is perfectly organised, as we often live from hand to mouth. So, it’s very important to plan these projects carefully and have coaches who are there for their students. After all, companies depend on these projects, which therefore must be supervised. I’m convinced that having such coaches will be the key to ensuring that bydo projects are of good quality, which in turn will help us create real value for everyone involved.

 

So, you’re saying that we must use new ways to get students to acquire this expertise? What does that mean for me as a future specialist in the labour market, Peter?

Peter: Leadership is certainly a key factor. I think leadership skills are hard to acquire in an educational environment because you need to practise them, and young people often have few opportunities to do so. In our bydo projects, students are placed in actual companies where they see how organisations and hierarchies work, understand what else needs to be considered, and very quickly get a realistic sense of what to expect and strive for in a specific situation.

Around 100 graduates leave our Master’s programme and enter the job market every year. What do they need to have to distinguish themselves, Andreas?

Andreas: I clearly believe that they need to fully understand what the industry expects of them. This means they must be in command of the technology as well as the setting in which it is applied, as in the case of a data scientist, who has to be a specialist in the technology as well as in the context of the data he or she is using. Data scientists play an essential role in the digital transformation and thus have to know how the business and technical sides fit together.
But data scientists also need to become much more creative. AI will never be able to deliver innovative, intelligent, clever solutions that fully serve specific needs in the market. Yet it is precisely such solutions, rather than generic technical knowledge, that add value. The data scientists we train learn first and foremost how to ask the right and relevant questions about specific real-world problems, and then set out to solve them. But let’s not forget that data and AI are sensitive topics that raise many questions, including legal and especially ethical ones. So, data scientists need a solid foundation as well as a clear idea of what is feasible, what can be done and what should not be done.

What’s next for bydo?

Peter: We don’t want to reinvent every project from scratch but create a platform that can map a thousand or even more projects.

Andreas: We are in the process of systematically developing our modules. Industry has shown a lot of interest, and we have already been able to get companies such as ON, Swisscom, SWISS and IBM on board. For example, a company may have little experience with implementing AI in its business, so it would make good sense to set up a small think tank of students and ask them to work out various use cases for optimising its business processes by means of AI. Students can also carry out ongoing and exploratory tasks for which a company may lack the time, resources or knowledge. What’s more, there’s a war for talent in this area, so it’s good to engage students in the dialogue at a very early stage. Who knows, maybe bydo will even help someone to find a job.

Andreas and Peter, thank you very much for talking with us.

For more information about bydo, visit: www.bydo.swiss


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Data governance in Switzerland: Navigating a unique landscape

Data governance in Switzerland: Navigating a unique landscape

Data governance in Switzerland is a unique blend of stringent privacy laws, a decentralised regulatory framework, high security standards, a cultural emphasis on privacy and an advanced IT infrastructure. "The distinctive Swiss approach to data governance not only protects personal data, but also fosters an innovative and secure environment in which businesses can thrive," says Dr Dimitrios Marinos, lecturer at the Lucerne University of Applied Sciences and Arts, in his guest article.

Shortcuts:
Intro | Data privacy laws | Decentralised governance structure | Data security standards | International data transfers | Cultural emphasis on privacy | IT infrastructure and innovation | Dynamic regulatory landscape | Balancing innovation and compliance | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


Switzerland’s approach to data governance is characterised by stringent privacy laws, a decentralised regulatory structure, high security standards, and a strong cultural emphasis on individual rights. This unique environment presents both challenges and opportunities for organisations operating in the country.

In this blog, we explore the key elements that define Swiss data governance, from the Federal Act on Data Protection (FADP) and its alignment with the EU’s GDPR to the dynamic regulatory landscape and the balance between innovation and compliance. Together, these factors create a robust and forward-thinking framework for data management in Switzerland. 

Figure 1: The core elements of data governance in Switzerland 


Stringent data privacy laws

The foundation of Swiss data governance lies in the country’s rigorous data privacy legislation. The Federal Act on Data Protection (FADP) is the cornerstone of these efforts. The revised FADP, which entered into force in September 2023, is closely aligned with the European Union’s General Data Protection Regulation (GDPR), ensuring that Swiss data protection standards are among the highest in the world. The revised act strengthens the rights of data subjects and imposes stricter requirements on data controllers and processors. It mandates transparency in data handling, robust security measures and lawful processing practices.

One of the significant aspects of the FADP is its emphasis on the rights of individuals. Data subjects have extensive rights, including the right to access, rectify and erase their data. These rights give individuals greater control over their personal data and reflect Switzerland’s commitment to protecting personal data.

Decentralised governance structure

Switzerland’s federal system adds a layer of complexity to data governance. In addition to the FADP, each of the country’s 26 cantons has the authority to enact its own data protection regulations and maintain its own data protection authority for data processing by cantonal and communal public bodies. This decentralisation means that organisations must navigate both federal and cantonal laws to ensure compliance across multiple jurisdictions, which calls for a tailored approach in which data governance practices are carefully managed and aligned with different regional requirements.

The decentralised governance structure requires organisations to be diligent and proactive in their compliance efforts. They must stay abreast of the specific regulations in each jurisdiction in which they operate and ensure that their data protection practices are aligned with these local laws. This approach not only ensures compliance, but also builds trust with local stakeholders.

High standards for data security

Swiss data governance is also characterised by high standards of data security. The country is renowned for its rigorous security measures, which often exceed those of many other countries. These measures typically include encryption, secure data storage solutions and stringent access controls designed to protect against unauthorised access, data breaches and cyber-attacks.

The focus on data security is designed to protect sensitive information and maintain the integrity and confidentiality of personal data. Organisations must implement robust technical and organisational measures to ensure the security of the data they process. This includes regular risk assessments, the implementation of security policies and continuous monitoring to detect and respond to potential threats.
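As one small illustration of such a technical measure, the sketch below encrypts a sensitive record at rest using the Python cryptography package. The package choice, the ad-hoc key handling and the sample field are assumptions made purely for illustration; neither the FADP nor this article prescribes a specific tool or algorithm.

# Minimal sketch: symmetric encryption of a sensitive field at rest
# (illustrative; tool and key handling are assumptions, not requirements).
from cryptography.fernet import Fernet

# In practice the key would come from a managed key store,
# not be generated ad hoc alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = "Max Muster, CH-6002 Lucerne".encode("utf-8")
token = cipher.encrypt(record)      # ciphertext that is safe to store
restored = cipher.decrypt(token)    # recovery requires the same key

assert restored == record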

International data transfers 

Navigating international data transfers is another critical aspect of data governance in Switzerland. The country is considered a third country by the EU for data protection purposes, but has been granted adequacy status by the European Commission. This status allows data to flow between the EU and Switzerland without additional safeguards, facilitating international operations. However, organisations must still ensure compliance with specific requirements for international data transfers, including adequate protection of data and respect for the rights of data subjects.

Adequacy status attests to Switzerland’s high data protection standards, which are considered equivalent to those of the EU. This status is a competitive advantage for Swiss companies, making it easier for them to engage in international data transfers. However, organisations must remain vigilant and ensure that they comply with all relevant regulations to maintain the integrity and security of data during international transfers.

 

Cultural emphasis on privacy 

A deep cultural emphasis on privacy and confidentiality also characterises Swiss data governance. This respect for privacy is deeply rooted in Swiss traditions, such as banking secrecy, which has historically been an important aspect of the country’s identity. This cultural backdrop influences how data governance policies are formulated and enforced. Organisations operating in Switzerland need to be particularly sensitive to public concerns about privacy and data protection, and ensure that their data handling practices are in line with these cultural values.

Swiss culture places a high value on individual privacy, and this is reflected in the country’s approach to data protection. The emphasis on privacy goes beyond legal requirements to include ethical considerations and societal expectations. Organisations need to demonstrate their commitment to respecting privacy and protecting personal data, which helps to build trust with customers and other stakeholders.

Advanced IT infrastructure and innovation

Switzerland’s advanced IT infrastructure and commitment to innovation have a significant impact on its approach to data governance. The country is a leader in areas such as financial technology, biotechnology and pharmaceuticals, which drives sophisticated approaches to data management and protection. This innovative environment encourages the use of advanced analytics and data processing techniques, while maintaining stringent compliance with data privacy laws.

The country’s robust IT infrastructure provides a solid foundation for data-driven innovation. Organisations are leveraging cutting-edge technologies, such as artificial intelligence and machine learning, to enhance their data governance practices. These technologies enable more efficient data processing, improved data quality and enhanced security measures. At the same time, the regulatory framework ensures that innovation does not come at the expense of data protection.
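As a hedged example of how ML can support data quality in practice, the sketch below flags outlying records as candidates for review using scikit-learn's IsolationForest. The synthetic data, the single feature and the contamination threshold are illustrative assumptions rather than any particular organisation's setup.

# Minimal sketch: flagging anomalous records as candidate data quality issues.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Mostly plausible transaction amounts, plus a few implausible outliers
amounts = np.concatenate([rng.normal(100, 15, 500), [10_000, -250, 9_500]])
X = amounts.reshape(-1, 1)

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)           # -1 marks suspected anomalies

suspects = amounts[labels == -1]
print(f"{len(suspects)} records flagged for review:", suspects)

Flagged records would typically be routed to a data steward rather than corrected automatically, keeping human oversight in the loop.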

Dynamic regulatory landscape 

The regulatory landscape in Switzerland is dynamic and constantly evolving. Compliance with Swiss data protection laws is overseen by the Federal Data Protection and Information Commissioner (FDPIC). The FDPIC provides guidance, conducts investigations and enforces data protection regulations, playing a key role in maintaining high standards of data governance. This proactive regulatory approach addresses emerging data governance challenges such as big data, artificial intelligence, and cross-border data flows.

The Swiss regulatory framework is designed to adapt to these evolving challenges, ensuring that data protection remains robust and up to date. The FDPIC regularly updates its guidelines and recommendations to take account of new developments and emerging risks. Organisations are encouraged to keep abreast of these updates and incorporate them into their data governance practices to ensure ongoing compliance and protection.

Balancing innovation and compliance 

One of the key challenges and opportunities in Swiss data governance is balancing innovation and compliance. Switzerland’s leadership in various high-tech industries requires a sophisticated approach to data governance that supports innovation while ensuring data protection. Organisations must adopt advanced data management technologies and practices that comply with strict regulatory requirements. This balance is critical to maintaining Switzerland’s competitive edge in technology and innovation, while maintaining its reputation for rigorous data protection standards.

Organisations must strike a delicate balance between embracing new technologies and complying with regulatory requirements. This means investing in secure and compliant data management solutions, conducting regular risk assessments and fostering a culture of data protection within the organisation. In this way, organisations can drive innovation while maintaining the highest standards of data governance.

Companies that successfully navigate the complexities of Swiss data governance can gain a competitive advantage by building trust with customers and stakeholders, while leveraging cutting-edge technologies to drive growth and innovation. The commitment to high standards of data protection and security will continue to be a hallmark of Switzerland’s approach to data governance, setting a benchmark for others to follow.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Professional Data Science Portrait with Leo Rettich: Data Scientist, Zürcher Kantonalbank

Professional Data Science Portrait with Leo Rettich: Data Scientist, Zürcher Kantonalbank

Handball, cryptocurrencies, cancer cells, and politics are just some of the topics Leo Rettich tackled while working on his Master's degree in Applied Information and Data Science at Lucerne University of Applied Sciences and Arts (HSLU). Looking back on it all, he says, "It's precisely this combination of content and technology that intrigues me about data science." In this interview, he explains, among other things, what he currently does as a data scientist at Zürcher Kantonalbank.

Shortcuts: Interview | Info-Events | Programme Information | Contact | Professional Data Science Portraits

Leo Rettich is a data scientist at the Zürcher Kantonalbank and a graduate of the Applied Information and Data Science Master’s programme @HSLU 


First, tell us a bit about yourself: Which hashtags describe you the best?

#NeverStopLearning
#LeoGoesDataScience

Please tell us more.

About #NeverStopLearning: Whether in my private life, at work, or in a training course, it’s vital for me to keep learning and improving daily. This makes me feel fulfilled, that I’m doing the right thing, and that I’m staying on the ball. It’s also why I’ve been attending further education courses for as long as I’ve been working, most recently the MSc in Applied Information and Data Science at HSLU.

About #LeoGoesDataScience: After working for around ten years as a software developer, I set myself the goal of making data science part of my personal development plan, not only in terms of training but also professionally. Thanks to an internal transfer to a data science team, which my line manager communicated at the time as “Leo goes Data Science,” I reached this goal a few months ago and have since been taking my first steps in this field at Zürcher Kantonalbank.

 

What do you do at Zürcher Kantonalbank? 

As a Data Science Lab team member in the Value Stream Information Management unit, I develop key data science skills and tools within Zürcher Kantonalbank. We implement use cases for the IT, Operations, and Real Estate units and help others with their data science projects.

What did you do before, and why did you join Zürcher Kantonalbank?

I have been with Zürcher Kantonalbank for almost 15 years, including my years as an apprentice. So, the question should rather be why I’ve stayed with Zürcher Kantonalbank for so long. Well, the bank has an excellent corporate culture and strongly emphasises employees’ personal development, which certainly has contributed to my staying in this job all these years. In addition, the IT unit at Zürcher Kantonalbank offers an exciting range of career options. Over 1,000 IT employees work on numerous innovation and digitalisation projects in Zurich’s District 5 and ensure that one of Switzerland’s largest universal banks runs smoothly.

What’s the most exciting part of your job?

I love not only to dive deeply into the content we need when managing data from the various disciplines but also to tackle the technically demanding challenges that data science offers. What also fascinates me about my job is the chance to regularly explore perspectives and options for applying the latest methods and technologies – and to find out what is possible when we apply them profitably to the valuable data troves and technologies we have in our bank.

Which data scientist skills are particularly in demand in your job?

In the current phase, which involves developing the systems and tools for applying data science, it is essential to understand the full range of choices thoroughly. On the one hand, you need a technical command of the entire data value chain – from data management and data engineering, all the way to analytics and machine learning. On the other hand, you have to keep a constant overview of the latest methods and tools in a very fast-changing environment so that you can incorporate them into the design of Zürcher Kantonalbank’s data systems.

Do you see yourself more as a techie, an analysis freak, a creative genius, a management superhero or a brilliant all-rounder?

With my background as a software developer, I’m certainly more of a techie. I like solving technical problems and am generally willing to take on larger programming tasks. However, I realise that my preference for data science also tends to make me an all-rounder. My studies and job require and encourage me to develop a wide range of skills – from communication, planning, and brainstorming, all the way to very technical things like setting up the IT infrastructure.

What fascinated you the most during your studies (MSc in Applied Information and Data Science)?

In general, topics such as machine learning and artificial intelligence are the natural follow-up once you’ve learned to develop software. In contrast to traditional programming, data science doesn’t just involve problems that call for clearly formulated rules. That’s what I find so fascinating and motivating. During my studies, I was able to apply what I had learned in many applied projects, which gave me a wide range of areas to immerse myself in. For example, in addition to all the data science methods, I also learned about crime, handball, cryptocurrencies, cancer cells and politics. It’s precisely this combination of content and technology that I find so intriguing about data science.

 

What are the biggest challenges in your job at the moment?

The biggest challenge at the moment is organising the systems and tools with which to apply data science to the use cases in our bank. We face many uncertainties at the moment about the IT architecture, tools, IT governance, data protection, the current shift to the cloud, technological progress, and many unclear responsibilities. And it takes a lot of effort to finally become productive with data science.

What advice do you have for someone who wants to follow in your footsteps?

My advice would be not to get discouraged if things don’t move ahead as quickly as expected in a large company like Zürcher Kantonalbank, but to keep at it and trust that your fascination with data science will see you through whatever happens to be in the way.

And finally, what new hashtag are you aiming for?

In Swiss German, we have #zämeMehUsDateUsehole, which basically means working together to get the most out of our data. It’s the official vision of our value stream. For me, it means collaborating within Zürcher Kantonalbank to create real added value from the available data. I still see a lot of potential here to reach this goal fully and would like to contribute more.

We would like to thank Leo Rettich for his dedication and for sharing these valuable insights.


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ
