AI-powered cancer detection: A research project by Morteza Kiani Haftlang

Morteza Kiani Haftlang is on a mission to harness Artificial Intelligence (AI) for early cancer detection. With a background in engineering, AI, and deep learning, he transitioned into healthcare to apply his skills where they matter most. His research at IMAI MedTec explores self-supervised AI models for detecting cancer in 3D light sheet microscopy (LSM) images, aiming to enhance accuracy and reduce manual labelling. Are you curious about how AI can revolutionise cancer detection? In this interview, Morteza shares insights into his work, the challenges of AI in medical imaging, and how his studies at HSLU have shaped his approach.

Shortcuts: Interview | Info-Events | Programme Information | Contact | Professional Data Science Portraits

Morteza Kiani Haftlang, a graduate of the MSc in Applied Information and Data Science at Lucerne University of Applied Sciences and Arts, conducted his thesis at IMAI MedTec on AI-driven medical imaging for cancer detection.


Introduction

First of all, tell us something about yourself: What hashtags best describe you?

#Learner #Multidisciplinarity #Cook #AGIEnthusiast

Tell us more about the hashtags.

Each of the hashtags reflects an essential part of my personality and professional journey:

  • #Learner: Learning constantly is essential in an era of emerging technologies. I strive to keep up with mainstream AI trends, continuously learning and exploring new concepts in AI, healthcare and other fields.
  • #Multidisciplinarity: My goal is to connect multiple fields, from AI and medical imaging all the way to engineering.
  • #Cook: Cooking is my passion. Just like in AI, combining the right ingredients – data, models, and algorithms – leads to the best outcomes.
  • #AGIEnthusiast: I am obsessed with the future of artificial general intelligence and its potential to transform industries, particularly healthcare.

About your job: What did you do at IMAI MedTec?

At IMAI MedTec, my thesis focused on a comparative study of various models, particularly self-supervised AI models, to improve cancer detection from 3D histopathological images. In essence, the goal was to identify and label cancerous cells accurately. My work involved researching, training, and fine-tuning deep learning models that help pathologists analyse tissue samples more accurately and efficiently. By reducing the need for manual annotations, we can make cancer screening faster and more precise.

What did you do before? 

Before my thesis, I did an internship at Roche, where I was involved in data engineering and data analysis, working with data from production lines and blood sensors. Prior to Roche, I was an electrical engineer. So, you can see how my career has changed. I switched to healthcare AI because I wanted to apply my skills to a field where technology can save lives. The opportunity to work on innovative medical imaging at IMAI MedTec was too exciting to pass up – so that’s how my project started.

The project

Please tell us about your research project.

My research focused on self-supervised deep learning models for detecting cancer in 3D light sheet microscopy (LSM) images. Traditional histological analysis often overlooks cancerous cells due to limited tissue sampling, which can result in false negatives – up to 20% of cases may miss cancer cells. By leveraging AI, we aim to analyse entire tissue samples in 3D, thus reducing the risk of missed diagnoses.

We applied the models considered in this study – U-Net, BTUNet, YOLOv8x-seg, YOLOv8x+SAM, and HoverNet – to a dataset provided by IMAI. The project compared these models to find the optimal balance between accuracy and efficiency in segmenting cancerous cells.

What data and method did you use, and what insights did you gain or do you hope to gain?

We worked with high-resolution 3D LSM images of histopathological tissue samples. Because of their size (sometimes up to 100 GB) and complex content, these images have to be pre-processed: normalised, augmented and converted into formats suitable for deep learning.
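
To make this concrete, here is a minimal sketch of the kind of preparation described above: it min-max normalises a 3D volume and tiles it into fixed-size patches. The patch size and the file name are illustrative assumptions, not details of the IMAI pipeline.

import numpy as np

def preprocess_volume(volume, patch_size=(128, 128, 128)):
    """Min-max normalise a 3D LSM volume and split it into fixed-size patches."""
    v = np.asarray(volume, dtype=np.float32)
    v = (v - v.min()) / (v.max() - v.min() + 1e-8)   # intensities scaled to [0, 1]
    dz, dy, dx = patch_size
    patches = []
    for z in range(0, v.shape[0] - dz + 1, dz):       # tile into GPU-sized patches
        for y in range(0, v.shape[1] - dy + 1, dy):
            for x in range(0, v.shape[2] - dx + 1, dx):
                patches.append(v[z:z + dz, y:y + dy, x:x + dx])
    return np.stack(patches)

# Usage sketch with a hypothetical file; a real pipeline would normalise and
# tile chunk by chunk so a ~100 GB volume never has to fit into memory at once.
# volume = np.load("lsm_sample.npy", mmap_mode="r")
# patches = preprocess_volume(volume)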

We evaluated five key models:

  • U-Net: A classic segmentation model widely used in biomedical imaging.
  • BTUNet: A self-supervised learning version of U-Net utilising Barlow Twins.
  • YOLOv8x-seg: A real-time segmentation model optimised for speed.
  • YOLOv8x+SAM: A hybrid model incorporating the Segment Anything Model (SAM).
  • HoverNet: A powerful dual-task model designed for histopathology.
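
To illustrate how the outputs of the five models listed above can be compared, the short sketch below scores segmentation masks with the Dice coefficient, a standard overlap metric. The random masks and the resulting numbers are placeholders only; this is not the study's evaluation code or its results.

import numpy as np

def dice_score(pred, truth):
    """Dice coefficient between two binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

rng = np.random.default_rng(0)
truth = rng.random((256, 256)) > 0.7                      # placeholder ground-truth mask
predictions = {name: rng.random((256, 256)) > 0.7         # placeholder model outputs
               for name in ["U-Net", "BTUNet", "YOLOv8x-seg", "YOLOv8x+SAM", "HoverNet"]}

for name, mask in predictions.items():
    print(f"{name:12s} Dice = {dice_score(mask, truth):.3f}")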

Results and Findings

How can your insights help our society?

Our research provides valuable insights that can significantly enhance cancer detection and diagnosis. Among the models we evaluated, HoverNet demonstrated the highest segmentation accuracy, making it the most reliable choice for precise cancer detection. BTUNet excelled in handling limited labelled data, proving the effectiveness of self-supervised learning while delivering more stable prediction results. Meanwhile, YOLOv8x-seg stood out for its speed, making it a strong candidate for real-time applications, though with a slight trade-off in segmentation accuracy.

Early cancer detection saves lives. By improving segmentation accuracy, enabling better auto-labelling, and reducing reliance on manual annotation, our research contributes to:

  • Increased diagnostic precision, minimising the risk of false negatives.
  • Automation of tedious tasks, allowing pathologists to focus on more complex cases.
  • Enhanced accessibility, making advanced diagnostics feasible even in low-resource settings.

What are your goals for your project in future?

As I look ahead, I see several key directions in which to develop this project further. One major focus is to make the model more robust by training on a more diverse dataset to ensure better generalisation across different tissue types and conditions. Additionally, optimising model architectures for real-time deployment will help reduce processing times, making AI-assisted diagnostics faster and more efficient. Another exciting approach is integrating multi-modal imaging data; for example, by combining MRI with histopathology to provide a more comprehensive analysis of cancerous tissues. Ultimately, the goal is to apply AI-assisted diagnostic tools in real-world clinical settings, bridging the gap between research and practical medical applications and thus improving patient care.

How did your studies in the Applied Information and Data Science programme influence the project?

My background in engineering, AI and deep learning provided the technical foundation for this research. Additionally, my studies at HSLU helped me develop a more structured problem-solving approach, which was crucial when working with large-scale medical datasets and conducting deep-learning experiments. More importantly, though, the invaluable support from my supervisor, Dr Umberto Michelucci, pointed me in the right direction and played a key role in shaping this project.

 

What advice would you give to others starting on similar projects?

Understanding the domain is crucial – working closely with medical experts ensures that AI models align with real-world needs and address practical challenges. At the same time, data quality is just as important as the model architecture. After all, preparing data from scratch can be as demanding as extracting oil from an offshore field, requiring significant effort and precision. Experimentation and iteration play a key role in improving performance, making it essential to try different models, loss functions, and augmentation techniques while also learning from existing research and best practices in the field. Lastly, patience is vital, as data-driven projects in med-tech are often time-consuming due to complex data structures and ethical considerations. But persistence and careful refinement will ultimately help you make meaningful progress.

And finally: What new hashtag are you aiming for in future?

#CollectiveLearning: I truly believe that progress in AI isn’t an individual pursuit – it’s a collective journey. By sharing research, collaborating across disciplines and keeping an open mind, we can learn from each other and move forward together. AI has the power to transform our world, but only if we build it transparently, inclusively and ethically. I want to be part of a future where knowledge is shared rather than hoarded and will improve everyone’s lives through the power of AI. Only by working hand in hand can we create an AI-powered world that benefits everyone.

We want to thank Morteza Kiani Haftlang for his dedication and for sharing these valuable insights.

 


Data is the resource of the 21st century!
Register and join us for a free online Information-Event:

Monday, 11 August 2025 (Online, English)
Monday, 8 September 2025 (Online, German)
Monday, 6 October 2025 (Online, English)
Monday, 3 November 2025 (Online, German)

Programme Info: MSc in Applied Information and Data Science
More Field Reports & Experiences: Professional portraits & study insights
Frequently Asked Questions: FAQ

Why Data Governance Is Essential for Responsible AI Adoption


Data governance is essential for enabling reliable, ethical, and scalable AI by ensuring data quality, transparency, and compliance. Without it, AI systems risk producing biased, inaccurate, or non-compliant outcomes that can harm trust and organizational integrity. Strong governance provides the foundation for responsible AI adoption and long-term success.

Shortcuts:
Foundation | Explanation | Quality | Transparency | Privacy | Access | Risks | Compliance | Conclusion | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


Why AI Needs a Strong Data Foundation

In the era of digital transformation, artificial intelligence (AI) stands as a defining force reshaping industries, economies, and everyday life. From personalized recommendations to predictive maintenance and automated decision-making, AI has moved from concept to practical implementation across sectors. However, the effectiveness and trustworthiness of AI systems hinge on one crucial foundation: data. And at the core of managing data effectively lies data governance. Without it, AI becomes a high-risk endeavor susceptible to bias, inefficiency, and even regulatory consequences.

What Is Data Governance – and Why It Matters More Than Ever

Data governance refers to the overarching framework for managing data availability, usability, integrity, and security within an organization. It ensures that data is accurate, consistent, and trustworthy. In the context of AI, where algorithms rely on massive datasets to learn patterns and make decisions, the role of data governance becomes even more critical. Without quality data, AI systems can produce unreliable results, perpetuate biases, or fail entirely. Data governance provides the guardrails to ensure that data is not only clean and organized but also used ethically and in compliance with legal standards.

Data Governance Necessities: The 5 Pillars for Responsible AI

Enabling Data Quality and Model Reliability

One of the most important contributions of data governance to AI development is enabling data quality and consistency. AI models, particularly those using machine learning, are only as good as the data they are trained on. Inconsistent or inaccurate data can lead to flawed insights or unpredictable behavior. By establishing rules around data entry, classification, and lineage, data governance ensures that organizations maintain high standards of data quality throughout their pipelines. This allows AI to function with higher accuracy and greater reliability.

Promoting Transparency and Accountability in AI

Another area where data governance proves vital is in ensuring transparency and accountability. As AI systems increasingly impact decisions in areas such as finance, healthcare, and criminal justice, there is growing demand for explainability. Stakeholders, including regulators and consumers, want to understand how decisions are made. Data governance supports this by enforcing documentation of data sources, transformation processes, and access controls. When integrated into AI workflows, these governance mechanisms allow organizations to trace decisions back to the underlying data, thereby enhancing trust and supporting regulatory compliance.

Safeguarding Privacy and Security

Security and privacy are also deeply intertwined with data governance, especially when AI systems handle sensitive personal or financial information. Proper data governance frameworks help organizations classify data based on sensitivity, enforce access controls, and comply with data protection laws such as the GDPR or CCPA. In AI systems, which often aggregate and analyze vast quantities of personal data, this governance ensures that privacy risks are mitigated and ethical standards are upheld.

Empowering Innovation Through Controlled Access

Moreover, data governance enables data democratization without sacrificing control. In many organizations, AI projects are driven by different teams across departments. With a sound governance framework in place, organizations can empower these teams with the right data while maintaining oversight. This accelerates innovation and reduces the friction that often arises from siloed or inaccessible data.

The Consequences of Neglecting Data Governance

Conversely, the absence of data governance during AI adoption can lead to a cascade of issues. Poor data quality can cause models to misfire, potentially resulting in financial loss or reputational damage. For example, an AI system trained on biased data may make discriminatory decisions, leading to public backlash or legal challenges. Additionally, without clear data ownership or access policies, data silos may proliferate, hampering collaboration and scalability of AI initiatives.

Compliance Risks and the “Black Box” Problem

The lack of governance also increases the risk of non-compliance with data protection laws. AI systems often process personal data, and without appropriate controls, organizations risk violating regulations, facing heavy fines, or suffering data breaches. Moreover, in the absence of documentation and metadata management, AI outputs become black boxes – difficult to audit, explain, or improve. This opacity erodes trust among users and stakeholders and undermines the long-term viability of AI solutions.

Governance First, Then AI

Organizations seeking to harness the power of AI must first ensure that their data is governed with rigor and foresight. Without this foundation, the promise of AI can quickly turn into a perilous journey marked by errors, inefficiencies, and missed opportunities.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.

 



Professional Data Science Portrait with Alexis Lüthi: Data Engineer & Digital Solutions Manager, Schindler


Alexis Lüthi understands data flows, digital solutions and the potential hidden in automation. As a Data Engineer & Digital Solutions Manager at Schindler Elevators Ltd (Switzerland), he brings structure to data chaos and pursues the clear goal of creating value. A graduate of the MSc in Applied Information and Data Science programme at Lucerne University of Applied Sciences and Arts, Alexis has an unconventional career path – from the hotel business into the world of BI, RPA and the Azure Data Factory. This professional portrait explains why he finds the combination of technology and business so exciting.

Shortcuts: Interview | Info-Events | Programme Information | Contact | Professional Data Science Portraits

Alexis Lüthi, a graduate of the MSc in Applied Information and Data Science at Lucerne University of Applied Sciences and Arts, currently works as a Data Engineer & Digital Solutions Manager at Schindler Elevators Ltd.


First of all, please tell us something about yourself: What hashtags best describe you?

#DataEngineering #DigitalTransformation #Innovation #BI #Automation

Tell us more about the hashtags.

My daily work revolves around data, digital processes and automation. As a data engineer, I structure data in a way that helps businesses make better decisions. I find digital transformation so exciting because it allows me to replace existing processes with new technologies and approaches. Business intelligence (BI) is essential for data-driven decisions, and automation helps to minimise repetitive tasks and increase efficiency.

Let’s talk about your professional activities: What do you do as Data Engineer & Digital Solutions Manager at Schindler Elevators Ltd?

I’m responsible for developing and implementing digital solutions at Schindler Elevators Ltd in Switzerland, where I lead projects in data engineering, BI, AI and robotic process automation (RPA). My focus is on optimising data flows, automating processes and providing data analytics to help the company make better decisions.

What did you do before, and why did you join Schindler Elevators Ltd?

I originally worked in the hospitality industry and graduated from the Hotel Management School in Lucerne. During the COVID-19 pandemic in 2020, I decided to change careers and thus discovered my passion for data and digital processes. The Data Science and Data Engineering programme gave me a lot of experience with IT projects, and I eventually got the opportunity at Schindler to develop innovative, data-driven solutions. 

What’s the most exciting part of your job?

The variety! Each day is different – I collaborate with different business units, develop new digital solutions, and analyse complex data structures. It’s especially exciting when raw data is turned into real value for the company. I also love using innovative technologies like Azure Data Factory, Databricks, Power BI, or RPA tools to optimise processes.

Which data scientist skills are particularly in demand in your job as Data Engineer & Digital Solutions Manager?

In my role, SQL and Python skills are essential, as I work with databases and develop ETL processes on a daily basis. Additionally, cloud computing skills (especially Azure) and experience in business intelligence (Power BI, Qlik) are important. But it’s also crucial to be able to think analytically, understand business processes and know how to design complex data models.

Do you see yourself more as a techie, an analysis freak, a creative genius, a management superhero or a brilliant all-rounder?

I’d describe myself as a mix between a techie and a data nerd. I love experimenting with new technologies and developing solutions, but I also really enjoy deep-diving into data analysis to uncover new insights.

What fascinated you most about the MSc in Applied Information and Data Science programme?

The combination of theory and practice – especially the projects where we analysed real datasets and developed solutions. During my studies, my horizons broadened a lot in a short time. The modules offered were up to date and very practical, which made it easier to apply what I learned directly in my professional life.

 

What are the biggest challenges in your job right now?

One major challenge is keeping up with the rapid pace of technological change. There are constantly new tools, frameworks and methods that could potentially be valuable to our work. Also, aligning business requirements with technical possibilities often requires a lot of communication and persuasion.

What advice would you give to someone who wants to do the same thing as you?

Be open to trying things out! Data engineering and digital transformation are incredibly dynamic fields, so it pays to keep learning and experimenting with new technologies. It’s also important to develop a solid understanding of business processes in order to apply data-driven solutions in a meaningful way.

Finally, what new hashtag are you aiming for in future? 

#AIIntegration – artificial intelligence will play an even bigger role in the coming years, and I want to develop my skills in this area further to create innovative solutions.

We would like to thank Alexis Lüthi for his dedication and for sharing these valuable insights.

 



Digital Companion: AI is redefining how we approach social support


Could AI-powered chatbots become a meaningful way to combat loneliness among older generations? Dr. Guang Lu is committed to finding out. Together with a team of Applied Data Science students from HSLU, he has developed an extraordinary AI chatbot.

Shortcuts:
Info-Events | Programme-Information | Contact | Professional Data Science Portraits

Dr. Guang Lu, lecturer of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.



Dr. Mirjam Stieger, lecturer of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.



Richard Moist, lecturer of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.



Dr. Martin Biallas, lecturer of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.



AI as a Digital Companion for Emotional Support

The AI chatbot possesses the unique ability to combine rule-based, retrieval-based, and generative responses, selecting the most suitable one for each interaction. This allows the chatbot to tailor its responses to the older person’s context and emotional state, offering both pre-defined comforting phrases and dynamically generated empathetic messages through advanced NLP models.
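
To show what such a hybrid selection strategy might look like in code, here is a minimal sketch with an invented rule set, a tiny retrieval corpus and a placeholder generative fallback; it illustrates the idea rather than the team's actual implementation.

RULES = {  # rule-based: safe, pre-defined comforting phrases for clear intents
    "hello": "Hello! It is lovely to hear from you. How are you feeling today?",
    "goodbye": "Take care. I will be here whenever you want to talk again.",
}

FAQ = {  # retrieval-based: reviewed answers for known topics
    "weather": "I cannot look outside, but a short walk is often nice if it is sunny.",
    "medication": "For questions about medication, your caregiver is the best person to ask.",
}

def generate_reply(message):
    # placeholder for a generative NLP model producing an empathetic, open-ended reply
    return "That sounds important to you. Would you like to tell me more about it?"

def respond(message):
    text = message.lower()
    for keyword, reply in RULES.items():      # 1) rule-based match
        if keyword in text:
            return reply
    for topic, reply in FAQ.items():          # 2) retrieval-based match
        if topic in text:
            return reply
    return generate_reply(message)            # 3) generative fallback

print(respond("Hello, it is me again!"))
print(respond("I am not sure I took my medication today."))
print(respond("I feel a bit lonely this evening."))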

From Pandemic Isolation to Digital Connection: The Evolution of an AI-Powered Chatbot

Dr. Guang Lu conceived this idea during the pandemic, witnessing the isolation many older individuals faced. Inspired to create a digital AI companion to alleviate loneliness, the team collaborated with local organizations such as Viva Luzern, Spitex Kriens, and Luzern60Plus. These groups played an integral role in shaping the chatbot’s design through co-creation workshops, ensuring that the final product met the specific needs of older users.

Developing the AI chatbot was a team effort, beginning with an in-depth analysis of user requirements. The team then worked through multiple iterations of system architecture, incorporating feedback from older adults and their caregivers. Their goal is to refine the chatbot into a fully-fledged AI-powered digital companion with expanded functionality. To achieve this, they are now seeking additional funding from the Velux Foundation and Innosuisse.

We would like to thank Dr. Guang Lu, Dr. Mirjam Stieger, Richard Moist and Dr. Martin Biallas for sharing their fascinating project!

Health Science x Data Science Series

This article is part of our “Health Science x Data Science” series, where we explore how data-driven innovation is transforming healthcare.

Learn more about the initiative on our project page: Health Science x Data Science

Check out the other articles in the series:

 



Smart Reha: Entering a New Era of Stroke Recovery with AI


Could sensor data shorten rehab after a stroke? Daniele Buson, a graduate of the Master’s programme in Data Science at HSLU, believes it’s possible. He is dedicated to SmartVNS – an AI-powered (Artificial Intelligence) brain stimulator worn in the ear that aids stroke recovery during everyday activities.

Shortcuts:
Info-Events | Programme-Information | Contact | Professional Data Science Portraits

Daniele Buson, graduate of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts.



Revolutionizing Stroke Rehab with Data, AI, and Smart Wearables

Daniele’s project aims to digitally integrate vast amounts of data from physical devices and make them accessible on a treatment platform. This innovation enables stroke patients to benefit from state-of-the-art neuroscience rehabilitation at home – personalized, professionally supported, AI-enhanced and cost-effective.

Understanding needs before building solutions

“Creating a digital solution starts with understanding the problem,” explains Daniele. The first step to a digital solution was identifying the needs of patients and therapists, ensuring that both perspectives were addressed. Using iterative coding and discussions, Daniele and his team refined the SmartVNS system to align with rehabilitation requirements. By leveraging machine learning, AI, and wearable technology, they created a platform that supports stroke recovery in a seamless and accessible way.

We would like to thank Daniele Buson for sharing his fascinating project!

Health Science x Data Science Series

This article is part of our “Health Science x Data Science” series, where we explore how data-driven innovation is transforming healthcare.

Learn more about the initiative on our project page: Health Science x Data Science

Check out the other articles in the series:

Stay tuned for more insights at the intersection of health and AI-powered data science!

 



Painless Glucose Monitoring: How AI transforms Diabetes Care


Data Science takes us closer to painless health monitoring. Could we do blood glucose testing without pricking our skin, using artificial intelligence instead? Yasmine Mohamed thinks so. During her MSc in Applied Information and Data Science at Lucerne University of Applied Sciences and Arts, she trained machine learning models to monitor glucose levels.

Shortcuts:
Info-Events | Programme-Information | Contact | Professional Data Science Portraits

Yasmine Mohamed, graduate of the MSc in Applied Information and Data Science at the Lucerne University of Applied Sciences and Arts



Revolutionizing Health: AI-Powered Glucose Monitoring Without the Prick

Yasmine’s project aims to predict hypoglycemia – a condition that can be dangerous during sleep – by using data from commercially available wearable devices, a continuous glucose monitor (CGM), and diary notes and by training machine learning models with the data.

Artificial Intelligence (AI) helps her to rethink health monitoring. Around half a million people in Switzerland have diabetes, requiring regular blood sugar measurements through blood samples or body-attached devices. But could smartwatches or similar devices soon be programmed to warn them about hypoglycemia?

Data Sources and Data Collection for blood sugar measurements

The algorithms developed in Yasmine’s Master’s thesis aim to provide non-invasive alerts for hypoglycemia. When integrated into smartwatches, these algorithms could offer people with diabetes a valuable new tool to manage their health.
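
As a rough illustration of this kind of approach, the sketch below trains a classifier on synthetic nightly wearable features to flag nights with a hypoglycemic episode. The features, labels and model choice are placeholders, not Yasmine's actual data or pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# synthetic nightly features: heart rate, heart-rate variability, skin temperature, activity
X = rng.normal(size=(500, 4))
# label: 1 = night with a hypoglycemic episode (synthetic rule, for illustration only)
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))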

Click here to check out Yasmine’s project!

We would like to thank Yasmine Mohamed for sharing her fascinating project!

Health Science x Data Science Series

This article is part of our “Health Science x Data Science” series, where we explore how data-driven innovation is transforming healthcare.

Learn more about the initiative on our project page: Health Science x Data Science

Check out the other articles in the series:

Stay tuned for more insights at the intersection of health and AI-powered data science!

 



Tactics meet AI – when algorithms play a part in sport


Can algorithms and AI read a football match? Can data tell us who the true playmakers are – and under how much pressure a pass was made? These are the kinds of questions explored in Episode 7 of Applied Data Science Unboxed, the podcast of the Lucerne University of Applied Sciences and Arts.

Podcast: Applied Data Science Unboxed
Episode 7: Tactics meet AI – when algorithms play a part in sport
Host: Fabio Sandmeier
Guests: Martin Rumo and Beat Suter, scoreTec

Shortcuts:
Positioning | Pressing | Playmakers | Tactics | Translation | Takeaway

Host Fabio Sandmeier visits the small yet powerful sports analytics company scoreTec, run by Martin Rumo and Beat Suter, to learn how raw data becomes tactical intelligence. The answer? Not just clever algorithms, but also lots of coffee chats, flipcharts, and a deep understanding of the game.

01:16 – XY Coordinates & Position Data: How Algorithms Read the Game

At the heart of sport analytics lies tracking data: XY coordinates that pinpoint the position of each player and the ball on the field. scoreTec’s work begins with this data, but not for the numbers alone; the goal is to understand the context behind them.

That’s why their workspace looks more like a strategy lab than a tech hub. Flipcharts, sketches, and printed match visuals dominate the room, a setup that reflects their core approach: translating game understanding into data structures.

Listen to this part (01:16)

03:09 – Pressing in Football: The Data Behind Pressure on the Ball

One of the episode’s standout stories involves defining pressing. Coaches intuitively understand when and how pressure is applied during a match. However, translating that into data took time. A spontaneous coffee conversation finally brought clarity: pressing involves identifying the 2–3 nearest players and their speed toward the ball.

This insight led to the development of a Pressing Index – a way to quantify pressure moments using tracking data and linear algebra. And most importantly, it helped answer a recurring question from coaches: Under how much pressure was a pass made?
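
A back-of-the-envelope version of that idea could look like the sketch below: take the nearest few defenders and weight how quickly each is closing in on the ball by how far away they are. The weighting is invented for illustration and is not scoreTec's actual Pressing Index.

import numpy as np

def pressing_index(ball_xy, players_xy, players_vel, k=3):
    """Pressure score from the k nearest defenders: closer players moving
    faster toward the ball contribute more (illustrative weighting)."""
    ball = np.asarray(ball_xy, dtype=float)
    pos = np.asarray(players_xy, dtype=float)
    vel = np.asarray(players_vel, dtype=float)
    to_ball = ball - pos                                   # vectors from defenders to ball
    dist = np.linalg.norm(to_ball, axis=1)                 # distances in metres
    closing = np.einsum("ij,ij->i", vel, to_ball) / (dist + 1e-8)   # speed toward ball (m/s)
    nearest = np.argsort(dist)[:k]
    return float(np.sum(np.maximum(closing[nearest], 0.0) / (dist[nearest] + 1.0)))

# Three defenders around a ball carrier at (50, 30); positions in metres, velocities in m/s
print(pressing_index((50, 30), [(48, 29), (53, 34), (60, 30)],
                     [(2.5, 1.0), (-1.5, -2.0), (-4.0, 0.0)]))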

Listen to this part (03:09)

06:13 – Playmakers in Ice Hockey: What Makes a True Playmaker?

In another project, scoreTec worked with HC Davos to analyze the role of the playmaker in ice hockey. But what does being a playmaker actually mean?

Through close collaboration, they defined a playmaker as someone who passes the puck into a better situation: closer to the goal, with more speed, and under less defensive pressure. Using positional and performance data, they created metrics to identify these players and presented their findings at the Spengler Cup.
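
In code, a first approximation of that definition might look like the sketch below, which scores a pass by how much closer to the goal the puck ends up, how fast it travels and how much defensive pressure it escapes. The weights and coordinates are invented for illustration.

import math

def pass_value(origin, target, goal=(100.0, 50.0), pass_speed=0.0,
               defenders_before=0, defenders_after=0):
    """Illustrative 'better situation' score: closer to the goal, more speed,
    less defensive pressure after the pass."""
    gain_towards_goal = math.dist(origin, goal) - math.dist(target, goal)
    pressure_relief = defenders_before - defenders_after
    return gain_towards_goal + 0.5 * pass_speed + 2.0 * pressure_relief

# A pass from the blue line into the slot that escapes one checker
print(pass_value(origin=(40, 45), target=(85, 48), pass_speed=12.0,
                 defenders_before=2, defenders_after=1))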

Listen to this part (06:13)

08:08 – From Game Philosophy to Data Pipeline: Making Tactics Measurable

The Swiss Football Association provides a simple yet powerful framework: divide the game into four phases (in possession, losing possession, out of possession, regaining possession) and three zones (defensive third, midfield, attacking third). scoreTec uses this logic to structure their analytics pipeline.

While the starting point is raw tracking data, they extract higher-value event data such as ball recoveries, passes, and transitions. These are visualized and made available to analysts, sometimes even live during the match, enabling coaches to make better decisions without spending sleepless nights with Excel.
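
A toy version of that structuring step is sketched below: each event is tagged with one of the three zones and one of the four phases so that higher-level statistics can be grouped accordingly. The pitch length and possession flags are illustrative assumptions, not scoreTec's pipeline.

def zone(x, pitch_length=105.0):
    """Map an x-coordinate (metres from the own goal line) to one of three zones."""
    third = pitch_length / 3.0
    if x < third:
        return "defensive third"
    if x < 2.0 * third:
        return "midfield"
    return "attacking third"

def phase(in_possession, possession_just_changed):
    """Map possession state to one of the four game phases."""
    if possession_just_changed:
        return "regaining possession" if in_possession else "losing possession"
    return "in possession" if in_possession else "out of possession"

# Example: a ball recovery 88 metres up the pitch
print(zone(88.0), "/", phase(in_possession=True, possession_just_changed=True))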

Listen to this part (08:08)

12:20 – Analytics Translation in the IDS Master’s Program: From Data to Answers

These same methods are part of the curriculum in the MSc in Applied Information and Data Science (IDS) at Lucerne University of Applied Sciences and Arts. Martin Rumo teaches students how to build data pipelines, extract meaningful events, and above all ask the right questions.

Moreover, in practical modules, students work with real experts like strength and conditioning coaches. Their task? To translate raw data into actionable insight. The secret is good listening, clear communication, and iterative co-creation.

Listen to this part (12:20)

Final Takeaway: Data Is Just the Beginning

Ultimately, this episode makes one thing clear: data is not the goal; insight is. Great sports data analytics doesn’t start with algorithms – it starts with the right question. And when tactics, data, and a touch of creativity come together, that’s when the real magic happens.

 



Avoiding Fraud & PR Crisis with AI


AI is revolutionizing how businesses tackle risks, from preventing financial fraud to boosting efficiency and protecting reputations. Companies like JPMorgan Chase, General Electric, and Facebook use AI to detect fraud, streamline operations, and manage public perception. As AI evolves, businesses that embrace it will stay ahead, reduce risks, and thrive.

Shortcuts:
Intro | Financial losses | Efficiency | Reputation | Legal risks | Future | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


AI: The key to business success in a fast-paced world

In today’s fast-moving world, businesses are turning to artificial intelligence (AI) to stay ahead of risks, save money, and keep things running smoothly. Whether it’s stopping financial losses, making operations more efficient, protecting their reputation, or staying on the right side of the law, AI is becoming an essential tool for companies looking to avoid big headaches and stay competitive.

Stopping financial losses before they happen

Nobody likes losing money, and AI is proving to be a game-changer in stopping financial losses before they happen. Take fraud detection, for example – banks and financial institutions are using AI to spot suspicious transactions in real-time. Machine learning algorithms analyze massive amounts of data to flag anything that looks unusual. JPMorgan Chase, for instance, uses AI-powered fraud detection systems to scan transactions and prevent fraud before it can cause any damage. AI also helps businesses make smarter investment decisions by predicting market trends and assessing credit risks, reducing costly mistakes.
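
As a generic illustration of this idea (not JPMorgan's system), the sketch below fits an off-the-shelf anomaly detector on ordinary transactions and flags unusual ones; the three features and the synthetic data are placeholders.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# synthetic transaction features: log amount, hour of day, distance from home (km)
normal = np.column_stack([rng.normal(3.0, 1.0, 2000),
                          rng.integers(8, 22, 2000),
                          rng.exponential(5.0, 2000)])
suspicious = np.array([[9.5, 3, 800.0],        # huge amount, 3 a.m., far from home
                       [8.7, 4, 650.0]])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(detector.predict(suspicious))            # -1 = flagged as anomalous, 1 = looks normal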

Boosting efficiency with AI-driven operations

Operational inefficiencies can drain businesses, but AI helps reduce wasted time and resources. It automates repetitive tasks, streamlines workflows, and improves decision-making. Predictive analytics help companies forecast demand and manage supply chains more effectively. This prevents unnecessary disruptions and optimizes operations. AI-powered chatbots and virtual assistants make customer service more efficient by handling simple queries. This frees up human agents for more complex issues. General Electric, for example, uses AI-driven predictive maintenance to monitor equipment and prevent unexpected breakdowns. This keeps everything running smoothly and saves millions in repair costs.

Protecting brand reputation with AI

Reputation is everything in business, and AI helps companies keep a close eye on how they’re perceived by the public. By analyzing social media, news articles, and customer reviews, sentiment analysis tools can detect potential Public Relations (PR) crises before they escalate. Brands can respond quickly to negative feedback and protect their image. Facebook, for example, uses AI to filter out harmful content and misinformation, ensuring that its platform remains a safe space for users. Fact-checking technologies also help businesses verify information, preventing the spread of false content that could harm their credibility.
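
A deliberately simple sketch of the monitoring idea is shown below: each public mention is scored with a small keyword lexicon and strongly negative posts are escalated. Real systems rely on trained sentiment models; the lexicon and threshold here are illustrative only.

POSITIVE = {"love", "great", "excellent", "recommend", "fast", "helpful"}
NEGATIVE = {"outage", "scam", "broken", "terrible", "refund", "lawsuit", "boycott"}

def sentiment(post):
    """Rough polarity: positive keyword hits minus negative keyword hits."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

mentions = [
    "Love the new release, great support and fast shipping!",
    "Another outage today?! Considering a boycott and asking for a refund.",
]
alerts = [m for m in mentions if sentiment(m) <= -2]   # escalate strongly negative posts
print(alerts)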

Staying compliant: AI’s role in legal and regulatory matters

Legal and compliance issues are a constant concern, especially in heavily regulated industries. AI helps companies stay compliant by scanning legal documents, industry regulations, and internal policies to ensure everything is in line with the latest requirements. Natural language processing (NLP) tools analyze contracts and flag potential legal risks before they become major problems. AI-driven security solutions also help businesses protect sensitive customer data, ensuring compliance with privacy laws like GDPR. IBM Watson, for example, provides AI-powered compliance solutions to financial institutions, helping them navigate complex regulations and avoid costly penalties.

The future is now: why businesses can’t ignore AI

At the end of the day, AI is no longer just a futuristic concept – it’s a must-have tool for businesses looking to reduce risks, cut costs, and stay ahead of the game. From stopping financial fraud to improving efficiency, managing reputation, and staying legally compliant, AI is changing the way companies operate. As technology keeps advancing, businesses that embrace AI will be the ones best equipped to adapt, grow, and thrive in the digital age.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.

 



The Price Companies Pay for Poor Data Governance


Lack of data governance leads to financial losses, operational inefficiencies, reputational damage and legal risks. Poorly managed data undermines decision-making, limits innovation and erodes trust, making data governance essential for efficient, compliant and data-driven organisational success. "The consequences of this oversight can be significant," says Dr Dimitrios Marinos in his expert article.

Shortcuts:
Intro | Financial costs | Operational inefficiencies | Reputational damage | Legal risks | Missed opportunities | Cultural impact | Info-Events | Programme Information | Contact

Dr Dimitrios Marinos, our lecturer at HSLU, has deep expertise in artificial intelligence, big data analytics, digital transformation, AI ethics, data governance and more.


Data governance plays a critical role

In the age of digital transformation, data is widely regarded as a critical asset for any organisation, large or small. Today, organisations are leveraging data to drive growth, improve efficiency and enhance the customer experience. However, with the rapid influx of data, managing, securing and ensuring the quality of data is becoming increasingly complex. This is where data governance – the framework for managing data assets, ensuring data quality, data privacy and regulatory compliance – plays a critical role. Yet many organisations still operate without robust data governance. The consequences of this oversight can be significant, resulting in both obvious and hidden costs that can hinder an organisation’s success. This article explores the financial, operational and reputational costs of not having data governance, and why investing in it is a strategic imperative.

Financial costs

One of the most immediate and measurable costs of a lack of data governance is the financial impact of poor data quality. Without governance, data quality suffers from inaccuracies, inconsistencies and redundancies that accumulate as data flows through different systems. When people rely on faulty data to make strategic decisions, the likelihood of costly mistakes increases. For example, a targeted marketing campaign based on incorrect customer data could result in wasted resources, missed opportunities and even loss of customers. According to a Gartner 2022 study, poor data quality costs organisations an average of $12.9 million per year in inefficiencies, errors and missed opportunities.

Data breaches are another significant financial cost of poor data governance. Data governance includes protocols to ensure data security and compliance with privacy regulations such as GDPR, CCPA and others. Poor data governance leaves organisations vulnerable to data breaches, exposing sensitive customer and business information to cybercriminals. The financial impact of a data breach can be huge; in addition to potential fines for non-compliance, organisations must also cover the costs associated with customer notification, legal fees, and any operational downtime resulting from the breach. IBM’s 2021 Cost of a Data Breach Report found that the global average cost of a data breach has reached $4.24 million, the highest figure in 17 years. For smaller organisations, a breach could be catastrophic, potentially threatening their very survival.

Operational inefficiencies

Beyond the financial losses, the lack of data governance creates significant operational inefficiencies that disrupt workflows and reduce productivity. Without clear standards and protocols for handling data, employees waste time searching for data, verifying its accuracy or cleaning it up. Imagine an organisation where different departments store customer data in different formats in multiple systems with no integration. Sales, marketing and customer support staff can spend hours consolidating data, resulting in duplicated effort and frustration.

Inconsistent data also hinders automation efforts. Automation technologies, from predictive analytics to machine learning, rely on accurate and standardised data to work effectively. When data is unstructured or poorly managed, the insights generated are unreliable and the technology’s potential remains unfulfilled. These inefficiencies lead to higher operational costs, slower decision-making and stifled innovation – an opportunity cost that cannot be ignored in a competitive environment.

Reputational damage

In today’s connected world, a company’s reputation can be its most valuable asset. Consumers and stakeholders expect organisations to manage their data responsibly and protect it from breaches and misuse. Data breaches, inaccurate reporting and unauthorised data sharing due to poor data governance practices can damage an organisation’s reputation. If customers lose confidence in an organisation’s ability to protect their information, they are likely to take their business elsewhere.

A damaged reputation due to poor data governance goes beyond the loss of customers. Potential business partners, investors and regulators may view the organisation as a risky investment. A single data breach, for example, can damage stakeholder confidence and affect stock prices. In some cases, the reputational damage can be even more lasting than the financial loss, as it can take years for an organisation to rebuild its credibility.

Legal and regulatory risks 

As data privacy regulations proliferate around the world, organisations must comply with a complex web of requirements to protect consumer data. GDPR, CCPA, HIPAA and other regulations require organisations to collect, store and process data responsibly, often with severe penalties for non-compliance.

Data governance is essential to maintaining compliance with these regulations, ensuring that all data practices are aligned with legal standards. Without data governance, organisations may inadvertently violate these regulations and face penalties, lawsuits, and even court-ordered restrictions on their ability to collect and process data. Fines under the GDPR can be as high as €20 million or 4% of annual global turnover, whichever is greater. Non-compliance can be particularly devastating for small and medium-sized businesses, which lack the financial cushion to absorb such fines.

Missed opportunities and strategic limitations

Data is a strategic asset that provides insights to drive growth, improve products and enhance the customer experience. However, these benefits can only be realised if the data is accurate, accessible and actionable. Without governance, data silos develop across departments, isolating valuable information in disparate systems and preventing organisations from gaining a holistic view of their operations.

When data is poorly managed, it also limits an organisation’s ability to innovate. Inaccurate or outdated data undermines the effectiveness of analytics, machine learning and artificial intelligence initiatives. As a result, organisations are unable to leverage data-driven insights for competitive advantage. Conversely, organisations with robust data governance can rely on their data assets to make more informed decisions, target customers more effectively, and respond to market changes with agility.

 

Cultural impact: Erosion of trust in data

An often overlooked cost of poor data governance is the erosion of a data-driven culture. When employees encounter data inconsistencies or inaccuracies, their confidence in the data diminishes. They may resort to gut instinct or subjective decision-making, defeating the purpose of data-driven strategies. Over time, this scepticism can spread, making it difficult to foster a data-driven culture within the organisation.

A lack of data governance also hinders cross-departmental collaboration. With data inconsistencies and confusion over data ownership, collaboration suffers, fostering an environment of mistrust and inefficiency. Without a unified approach to data management, employees work in silos, reducing organisational agility and effectiveness.

The solution: Prioritizing data governance

The good news is that implementing data governance is a proactive measure that can prevent these costs. By establishing a clear data governance framework, organisations can improve data quality, ensure compliance, streamline operations, and foster a culture that values data accuracy and security. Data governance initiatives may require an initial investment in technology, people and training, but the return on investment is significant.

In addition, data governance is a dynamic process that evolves with changing regulations, technologies and business needs. When integrated into the company’s strategy, data governance becomes an asset that enables organisations to use data responsibly and effectively.

From financial loss to reputational damage, operational inefficiencies, legal risks and lost opportunities, the consequences of inadequate data governance touch every aspect of an organisation. A comprehensive data governance strategy is not just a best practice, it is a necessity. Investing in data governance not only protects an organisation’s data assets, but also enables it to use data as a competitive advantage, ensuring long-term success in a complex, data-intensive landscape.

We would like to thank Dr Dimitrios Marinos for his dedication and for sharing these valuable insights.



Professional Data Science Portrait with Tim Giger: Principal Data & AI Consultant, Swisscom


Who wants to do things halfway? Our data science graduate, Tim Giger, asks this question with a wink. For him, there is no such thing as "adequate." He devotes himself to his work as a Principal Data & AI Consultant at Swisscom with great attention to detail – and is very successful. Learn more about how varied Tim's job profile is and what plans this 'techie with a management touch' has for the future.

Shortcuts: Interview | Info-Events | Programme Information | Contact | Professional Data Science Portraits

Tim Giger works as a Principal Data & AI Consultant at Swisscom and is a graduate of the Applied Information and Data Science Master’s programme @HSLU 


First, about you personally: What hashtags best describe you?

#curious, #open-minded, and #meticulous.
If I had to describe myself in three words, it would be the hashtags above. I’d use “curious” because I’m always looking for new ideas and challenges – I always like to discover new things. And I’d use “open” because I enjoy interacting with people and taking on new tasks, which I think makes me approachable. As for “meticulous”? Well, when I set my mind on something, I have to do it thoroughly. You could also call it my enthusiastic “obsession” with getting excellent results. After all, who wants to do things halfway?

Tell us more about the hashtags.

Curiosity is definitely what drives my work, as was the case during my studies. Whether it’s a new technology or a completely different way of tackling a problem, I want to fully understand what I’m doing and apply what I know. On the other hand, openness is the key because both in my studies and in my daily work I have to deal with many people, perspectives, and challenges, which are things I really appreciate. You need a certain openness to understand different perspectives. And then there’s meticulousness: when I dedicate myself to a task, I never find that something I’ve done is just adequate. I like to go into detail, and yes, I sometimes become really passionate about a project, which must have been a bit annoying for my fellow students at times.

 

Let’s talk about your professional activities: What do you do at Swisscom Data & AI Consulting?

I’m currently working as a Principal Data & AI Consultant at Swisscom. My job actually covers the full range – from strategy to architecture and solution design, all the way to implementing platforms and solutions in the data and AI field for customers in the Enterprise segment. Specifically, this can involve developing concepts for data architectures and platforms, setting up data warehouses and lakes, implementing ML models, or devising data strategies – always with a focus on the customer, of course. And all this always also includes sales activities, management tasks, and contributions to the team’s strategy development.

What did you do before, and why did you join Swisscom Data & AI Consulting?

My path into data science was, let’s say, not really planned. I originally worked in systems engineering, and then, by chance, I came across the world of data twelve years ago. All the incredible things you can do with simple 0s and 1s were a true revelation that fascinated me immediately. I knew right away that “this was it!” and that I wanted to continue to research, learn and get involved in this field.

What’s the most exciting part of your job?

The variety! Whether I’m developing solutions with customers in workshops or immersing myself in a technical problem, things never get boring. Of course, as a technician, I particularly enjoy the moment when a complex solution works properly and generates added value. However, the conversations and interactions with customers are always exciting. Sometimes, I feel like an “interpreter” between what’s technically possible and what customers actually need. It’s this balancing act that makes my work so engaging, which ultimately means devising a technical solution with the customer and then developing it.

Which data scientist skills are particularly in demand in your job?

You need to be good at analysing complex problems and requirements and coming up with technically sound solutions that the customer can understand and that are cost-effective. Of course, you also need solid technical skills, but your ability to communicate and correctly categorise and interpret requirements is just as important. The challenge often lies not so much in the technology or the model itself but in bridging the customer’s wishes with what is actually feasible. That’s the real art – and it’s what makes this job so exciting.

Do you see yourself more as a techie, an analysis freak, a creative genius, a management superhero or a brilliant all-rounder?

I’d say that I feel most at home in technology – so I’m definitely a techie. But that doesn’t mean I limit myself to that one area because I also enjoy managing projects, holding workshops, and giving talks. Sometimes I’m the techie in the quiet room where I immerse myself in the code, but other times I’m the all-rounder with my sleeves rolled up at the frontline. Maybe I’m a bit of a “techie with a management touch”, to put it differently.

What fascinated you most about the MSc in Applied Information and Data Science programme?

As a technology enthusiast, I was of course excited by the advanced analytical topics such as deep learning and natural language processing. But the programme had another and unexpected fascination for me, namely the diversity of people with different experiences who I got to work with. Everyone had a different background and perspective, which enriched our projects tremendously. The mix of technical challenges and human interaction was perfect for my personal and professional growth.

 

What are the biggest challenges in your job right now?

Wow, where should I start? One of the biggest challenges is definitely the rapid pace at which technology develops. You’ve hardly just come to grips with a new platform before another innovation pops up on the horizon. Then there are the dynamics of the customer environment – every project comes with new requirements, expectations and sometimes surprises. And finally there’s the team, which means knowing how to support and integrate our junior members while keeping an eye on what lies ahead, for example. Sometimes it feels like I’m juggling ten balls at once. But that’s what makes it all so interesting.

What advice would you give to someone who wants to do the same thing as you?

If you want to get started in data and AI, it’s not enough to just know the theory. You have to go out into the field, receive the ball and score the goal! We often say, “We’ll pass you the ball, but it’s you who has to score the goal.” In other words, you need curiosity, initiative, and above all enthusiasm. It’s not a sprint but a marathon, and you’ll need plenty of energy to stay ahead of the game. But don’t worry, it’s worth it – every metre of it.

To wrap things up: What new hashtag are you aiming for in the future?

There’s definitely one hashtag that’s been on my mind for a while, and that’s #research. I have so much fun discovering and researching things that I’m now thinking about doing a PhD. Of course, it’s going to be a challenge to work and do research at the same time, but that’s exactly what appeals to me. Who knows, maybe I’ll soon be writing not only code but also papers! After all, it always makes sense to reach for the stars, doesn’t it?

We would like to thank Tim Giger for his dedication and for sharing these valuable insights.

