This post explores the critical issue of students’ use of generative AI (GenAI) tools in professional workplace settings, particularly during internships. It highlights the risks of “shadow AI”: students often unknowingly expose sensitive company data to third-party platforms that lack enterprise-level security. While universities have developed AI policies for academic use, they have not adequately addressed work-integrated learning, and this gap in guidance leaves both students and employers vulnerable to data breaches and reputational harm. Ultimately, the piece advocates for proactive measures, including updating existing workplace documents and providing pre-placement training, to equip students with the ethical and practical knowledge necessary for responsible AI use in professional environments.

Resource: Dean, B., Nicola-Richmond, K., Tai, J., & Walton, S. (2025). Do your students know the consequences of AI use at work? Times Higher Education.

It’s safe to say that many university students are using generative artificial intelligence (GenAI) tools for learning, university assessments and personal use.

Most Important Aspects:

  1. Pervasive Student GenAI Use: Students are widely using GenAI for academic learning, assessments, and personal use. This comfort with the technology naturally extends to their work placements.
  2. “Shadow AI” in Work-Integrated Learning: A significant risk arises when students use GenAI tools without explicit workplace permission. Because they may lack knowledge of workplace policies, this gives rise to a phenomenon referred to as “shadow AI.” This behaviour, often unintentional, can lead to serious data breaches.
  3. Unintentional Data Exposure: Students are not acting maliciously. They often lack awareness of the specific ethical boundaries, workplace policies, and legal implications (e.g., privacy legislation) of using GenAI with confidential information.
  4. Inadequate Security of Personal GenAI Accounts: The majority of students access GenAI tools through personal accounts, which typically lack the robust enterprise-level security and data governance required for handling sensitive business information.
  5. Motivations for Student GenAI Use: Students are driven by a desire for efficiency and to impress supervisors. Internships often involve steep learning curves and unfamiliar systems. There is also time pressure. This makes GenAI a tempting “smart, efficient move” to clarify tasks, summarise information, or draft content.
  6. Gap in Policy Translation: University GenAI policies for academic settings often do not adequately translate to the work-integrated learning space. Furthermore, host organisations may not have considered how student interns fit into their existing GenAI governance frameworks.
  7. Existing Structures for Solutions: The good news is that work-integrated learning programmes are well-positioned to address this issue by evolving existing structures.
  8. Updating Documentation: Essential documents like IP agreements, legal contracts, role descriptions, and codes of conduct can be updated to include explicit guidance on GenAI use.
  9. Pre-Placement Preparation: Online modules, workshops, or in-class sessions before placement are ideal for discussing ethical principles, risks, and responsibilities.
  10. Placement Orientation: The initial meeting between students and supervisors is critical. It is an opportunity to discuss GenAI use, boundaries, and available support.

How will this impact healthcare students as they transition into employment? Perhaps they will be the innovators who drive change and realise the potential of AI, rather than being restricted by traditionally risk-averse approaches to change. By using AI for evidence-based translation of knowledge into practice, timeframes could be cut right down (they have to be able to beat 17 years!).

Conclusion:

The integration of GenAI into work-integrated learning presents significant challenges, stemming primarily from limited awareness of data security and confidentiality, and from policy discrepancies between academic and professional environments. However, by proactively updating existing programme structures and fostering open communication, universities and host organisations can mitigate these risks and prepare students to use GenAI tools “ethically, responsibly and with purpose,” ensuring a smooth transition into the future of work.