Generative AI policies
Purpose
This policy establishes the principles for the responsible and transparent use of Generative Artificial Intelligence (AI) tools in research and publication activities. It aligns with ethical standards recommended by the Committee on Publication Ethics (COPE), Elsevier, and the World Association of Medical Editors (WAME).
Acceptable Use of Generative AI Tools
Authors may use Generative AI tools (e.g., ChatGPT, Gemini, Copilot, Claude) only for limited and disclosed purposes, including:
- Improving grammar, language clarity, or readability
- Assisting with the organization or structure of a manuscript
- Supporting data visualization or figure formatting, provided all underlying data are genuine and verified by the authors
Authors retain full responsibility for the originality, accuracy, and ethical compliance of all content.
Prohibited Uses
Generative AI tools must not be used for:
- Generating research content, analysis, or references presented as original work
- Creating fabricated or manipulated data, images, or citations
- Performing peer review or editorial decisions
- Being listed as an author or co-author, as AI systems cannot take accountability or grant copyright
IJIE considers the undisclosed use of AI-generated content to be academic misconduct.
Disclosure Requirement
Authors must explicitly disclose any use of AI tools in the manuscript. The disclosure should appear in the Acknowledgments section or under a separate heading such as “Declaration of Generative AI Use”, using the following template:
“The authors used [tool name, version] for [specific purpose, e.g., grammar checking or language editing]. The authors reviewed and revised the content and take full responsibility for the final manuscript.”
Failure to disclose AI usage may result in manuscript rejection or post-publication retraction.
Reviewer and Editor Use of AI
Editors and reviewers are prohibited from using Generative AI tools to read, analyze, or evaluate manuscripts, as such tools may compromise the confidentiality, integrity, and objectivity of the review process.
Editorial staff may only use AI for administrative tasks such as formatting, grammar checking, or correspondence.
Accountability and Screening
All submissions may be screened using AI-content detection tools and plagiarism-detection software. The corresponding author bears final responsibility for ensuring that all AI-generated assistance has been transparently disclosed.
Policy Review and Updates
This policy will be periodically reviewed and updated to align with evolving international ethical standards and indexing requirements.