Charting a Humane AI Course for HR 

By Goran Trajkovski, Ph.D. and Ashley Dugger, DBA, SHRM-CP

In an era where artificial intelligence is advancing rapidly, even into areas as profoundly human as HR, ensuring that AI aligns with our cultural values becomes increasingly urgent. Rather than treating technology as a looming adversary that threatens workplace trust, we introduce the HEART methodology as a visionary approach to transforming intelligent systems into collaborative partners that support our highest ideals.

By leveraging AI, HR professionals may free up time for strategic planning and higher-value work, rather than spending it on administrative or routine responsibilities.

Unlocking Values-Based AI

The unchecked adoption of AI in HR poses a genuine threat to the essential human aspects of work, including nuanced judgment calls, cultural understanding, and sensitivity to emotional vulnerability. For instance, when recruiting algorithms screen solely on stated qualifications, they risk overlooking exceptional candidates whose potential is not captured by conventional credentials. Similarly, AI’s interpretation of behavioral policies can fall short due to a lack of contextual awareness.

However, there is a better way forward: a balanced approach that combines automation with human collaboration. By infusing humanistic values into AI from its inception, drawing insights from fields like human-centered design, we can ensure that these systems align with our cultural values. This proactive strategy avoids the need for reactive measures to rein in technology after it is already in use.

The HEART of the Matter

HEART, our proposed methodology for HR contexts, draws from moral philosophy and learning sciences to embed humanistic principles into AI engineering. It stands on five pillars:

Humanism: Anchoring AI in a deep respect for human autonomy and dignity by incorporating cultural values like justice into its very core, ultimately enhancing collective capability.

Ethics: Instilling accountability and trust through comprehensive traceability, auditing and external oversight mechanisms built into AI systems before issues arise.

Alignment: Forging a strong connection between AI capabilities and an organization’s cultural mission, ensuring they’re harmonized at the highest levels, not just driven by efficiency.

Responsiveness: Empowering AI to adapt dynamically to the workplace, learning about cultural needs, emotions and pain points as they evolve.

Trust: Earning confidence through transparent communication about AI’s purpose, limitations and data practices from the outset, while inviting constructive feedback.

Bringing Principles to Life: AI in Practice

Translating cultural values into AI systems requires more than rhetoric; it demands learning design techniques:

  • Prompt Engineering: Working with technical teams, HR professionals can shape AI by structuring prompts and responses that infuse cultural values into models. This interdisciplinary collaboration ensures inclusivity is embedded from day one (see the sketch after this list).
  • Personalized Interactivity: Designing AI interactions that adapt to individual needs and contexts, similar to a skilled teacher, demonstrates respect for employees as the workplace evolves.
  • Participative Refinement: Extensive collaborative testing workshops that focus on identifying subjective pain points enable employees to actively shape and utilize AI over time. Gathering feedback from diverse groups and regions enables continual improvement.
  • Ongoing Alignment: Regular transparent reviews assess AI based on evolving criteria, including usefulness, inclusiveness and alignment with cultural values. This keeps systems in tune with changing workplace norms through updated training, rather than relying on opaque recalibration by isolated engineering teams.
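
To make the prompt engineering idea concrete, here is a minimal sketch in Python. It assumes a hypothetical HR assistant built on a large language model; the function names, the values list, and the prompt wording are illustrative assumptions, not a prescribed implementation or a specific vendor API.

```python
# Minimal sketch: embedding organizational values into the prompts an HR
# assistant sends to a language model. The values list, the wording, and the
# send_to_model() stub are illustrative assumptions, not a real product API.

ORG_VALUES = [
    "Respect the dignity and autonomy of every employee.",
    "Consider context and circumstances before judging behavior.",
    "Flag any recommendation that could disadvantage a protected group.",
    "Defer to a human HR professional for final decisions.",
]

def build_system_prompt(task_description: str) -> str:
    """Compose a system prompt that states the task and the cultural values
    the model must honor in every response."""
    values_block = "\n".join(f"- {v}" for v in ORG_VALUES)
    return (
        "You are an assistant supporting HR professionals.\n"
        f"Task: {task_description}\n"
        "Always follow these organizational values:\n"
        f"{values_block}\n"
        "If the values conflict with the task, explain the conflict instead "
        "of answering."
    )

def send_to_model(system_prompt: str, user_message: str) -> str:
    """Placeholder for a real model call (e.g. an internal API or vendor SDK).
    Here it just echoes enough to show how the pieces fit together."""
    return (f"[model response to {user_message[:40]!r}, "
            f"constrained by {len(ORG_VALUES)} stated values]")

if __name__ == "__main__":
    prompt = build_system_prompt("Summarize candidate feedback for a hiring panel.")
    print(prompt)
    print(send_to_model(prompt, "Here are the panel's raw notes..."))
```

In practice, HR and engineering teams would review and version these value statements together, so that changes to the wording pass through the same participative refinement and ongoing alignment reviews described above.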

A Moral Compass for Progress

Elevating intelligent technology to align with cultural values is not just a choice; it is a moral duty grounded in respect for human dignity. AI has the potential to transform workflows, but its role should be that of an ally, not a disruptive force or a replacement for human judgment and relationships. With careful guidance, AI can become a collaborator that empowers employees, rather than a constraint imposed through opaque protocols.

This journey will be ongoing and will require extensive cooperation across organizations, functions and disciplines. Yet the initial steps, guided by compassion, pave the way for a harmonious coexistence of human and technological capabilities. The HEART framework offers a promising path for developing AI in concert with our cultural values, so that each reinforces the other.

Additionally, just as we give ourselves and others grace to adapt, learn and do better, we must approach the use of AI in the same manner. If our AI tools are not operating as we intended, or are not aligning with organizational and departmental goals and vision, it is up to us to adapt them so they work for us rather than against us.

HR Leadership in the AI Era

HR leaders have a unique opportunity to shape AI design proactively before adoption accelerates further, and to educate their teams, leaders and employees on planned, tactical uses of AI to earn early buy-in and support for these tools. Their expertise provides a perspective that efficiency-focused pioneers may overlook. By blending HR tradition with innovation, the HEART framework can help steer AI toward trajectories that elevate workplace culture.

The future is yet to be written, and it awaits our collaborative authorship. Despite inevitable challenges, progress guided by ethical principles empowers us to create a world where hearts and technologies merge toward horizons beyond our current vision. With vision and values as our guiding stars, we embark on this transformative journey together.

Goran Trajkovski, Ph.D.
Lead Academic Program Manager – College of Business
Western Governors University
goran.trajkovski@wgu.edu
wgu.edu

Ashley Dugger, DBA, SHRM-CP
Associate Dean and Director-HR Management Programs
Western Governors University
ashley.dugger@wgu.edu
wgu.edu