Between Promise and Prudence — Opening Remarks on AI Through an Islamic Ethical Lens

Opening keynote at the Royal Perak Golf Club, Ipoh, for a national gathering of Islamic agencies: a practical framing of AI's rapid advances, its real risks, and how Islamic ethics should shape responsible adoption across religious institutions.


I had the honour of delivering the opening address to a national assembly of Islamic agencies on 6 August 2025 at the Royal Perak Golf Club, Jalan Raja Azlan Shah, Ipoh. Representatives from Jabatan Agama Islam Perak and religious bodies from across Malaysia joined to discuss how public-facing Islamic institutions should understand, engage with and govern rapidly advancing AI technologies.

Framing the moment
AI is not merely a technical trend — it reshapes how knowledge is found, how decisions are made, and how communities interact. In my talk I set out the dual nature of this moment: an extraordinary opportunity to improve service delivery, education and access to religious guidance, but also genuine risks that demand careful stewardship.

The advances (what’s useful now)

  • Automation and productivity: rapid transcription, document summarisation, and routine administrative automation that frees staff to focus on higher-value duties.

  • Knowledge access: tools that can help organise large repositories (sermons, fatawa, training materials) for faster retrieval.

  • Educational tools: personalised learning aids, content generation for outreach and dakwah, and multimedia production that supports community engagement.

The risks (what to watch closely)

  • Context and correctness: generative models can produce plausible but incorrect religious or legal content; mistakes in religious guidance have high consequence.

  • Bias and disrespect: datasets may contain cultural or theological bias or generate content offensive to faith sensibilities.

  • Privacy and dignity: handling personal enquiries, counselling notes or marriage/divorce records requires strict data protection and confidentiality.

  • Social harm: misinformation, manipulation, or content that foments division or confusion (fitnah) must be actively prevented.

An Islamic ethical compass for AI

Key points I emphasised:

  • Intention and accountability (niyyah): deployments must have clear, justifiable objectives that serve the community’s welfare.

  • Prioritise preservation of faith (dīn) and intellect (ʿaql): any automated religious guidance needs human oversight and clear disclaimers.

  • Data minimisation and dignity: collect only what is necessary, secure it, and ensure confidentiality for sensitive cases.

  • Public benefit test: if a system cannot demonstrably avoid serious harms, defer or narrow its use.

Practical recommendations for Islamic agencies

  • Start small, validate early: pilot low-risk use cases (administrative automation, meeting summarisation, internal knowledge search) before scaling to advisory services.

  • Human-in-the-loop for religious outputs: all fatawa-like outputs must be reviewed by qualified scholars; AI can assist drafting but not replace scholars.

  • Sharia review and fatwa oversight: establish a cross-agency Sharia/tech committee to assess theological, legal and ethical implications.

  • Robust governance: data classification, access controls, audit logs, redress mechanisms and vendor due diligence (data residency, licence terms).

  • Capacity building: train imams, officers and IT teams on prompt literacy, risk spotting and incident response.

  • Community engagement: publish simple explanations for the public about where AI is used, how decisions are made, and how to appeal or query automated outputs.

Closing and next steps
The session was thoughtful and engaged; the questions reflected real operational concerns and a clear desire to combine religious integrity with modern tools. My closing message was one of cautious optimism: AI can amplify the good work of Islamic institutions, but only if technological adoption is wedded to values, scholarly oversight and sound governance.