The Association for Computing Machinery, or ACM for short, is a well-known organization in computer science. It supports many well-regarded scientific conferences and publishes their proceedings, in addition to organizing international programming contests and awarding the Turing Award, computer science’s equivalent of a Nobel Prize.
Unfortunately, ACM appears too trusting of new conferences and too slow in reacting to problems.
This post summarizes the problems I’ve found in ACM conferences organized by IARES,
the “International Association of Researchers”. Most of its conferences run under ACM’s banner, though some editions were held with IEEE.
In particular, IARES includes two researchers evidently at the center of dubious activity: Shadi Aljawarneh from the Jordan University of Science and Technology,
and Vangipuram Radhakrishna from the Vallurupalli Nageswara Rao Vignana Jyothi Institute of Engineering and Technology.
I summarize the problems below; curious readers can consult the spreadsheet with all details,
including links to PubPeer comments for each problematic paper.
I did not originally intend for this to be a public blog post, as ACM has a form to report issues, but they did not even acknowledge the report I filed about a conference full of plagiarism, so here I am.
Let’s first look at ICEMIS, the International Conference on Engineering & MIS.
The conference name does not bother to define MIS.
But the conference website also suggests a hotel in Morocco for a conference in Turkey, so perhaps explanations are not IARES’s strong suit.
Unlike many conferences, ICEMIS doesn’t rotate its program chair: it’s always Shadi Aljawarneh.
The 2015 edition is not that bad: it mostly contains papers that look like actual papers. There are already some cracks in the facade, though: a paper consisting of one half-blank page, a paper in which Radhakrishna cites himself 5 times without using those references in the text, and a paper, not even authored by Radhakrishna, that does the same. If you think this is poor form, abandon all hope and read on.
The next edition of ICEMIS published by ACM is 2018. After this time skip, the problems are more obvious. There are 8 papers in which citations to Aljawarneh or Radhakrishna make up more than 35% of the references. For instance, this one boosted Radhakrishna’s citation count by 26, in a paper with 34 references. More worryingly, there are 1, 2, 3 papers that cite Aljawarneh a few times in completely unrelated contexts, boosting his citation count for no apparent reason. The peer review process also let some interesting oddities through, such as a medical paper that is clearly unrelated to the conference, and another medical paper that is not only unrelated but also contains verbatim plagiarism.
ICEMIS 2019 makes the previous issues look subtle. The front matter contains over thirty citations to Aljawarneh. A paper authored by both Aljawarneh and Radhakrishna cites them 31 and 40 times respectively, despite not using any of its 70 references in the text. If you think that’s bad, wait until you see the one with 77 references, 41 of which are to Radhakrishna, despite the paper being less than one page of main text and not using any of the references either. There are 8 more papers that contain plenty of unused references to Aljawarneh and Radhakrishna. A user manual for a university’s internal IT system even appears in there.
At this point, you may find it hard to believe it could get worse. You would be wrong. ICEMIS 2020 also cites dozens of papers in its front matter, also contains citation vehicles for Shadi Aljawarneh, and for 1, 2, 3 of them the authors have publicly denied inserting the citations to the program chair. ICEMIS 2021 has the same front matter problem because its front matter editorial is a copy-paste of the 2020 edition, without even changing the dates. It also contains 1, 2, 3 papers that cite the program chair 113 times each, using the same reference list every time. Two of these come from authors with the family name Aljawarneh, which is probably a complete coincidence.
Did ICEMIS even take place as a conference? Its 2020 edition supposedly started on September 14 in Kazakhstan, a date on which the country was under COVID lockdown and thus in no state to host a public international event, so one wonders.
IARES organizes another conference: DATA, the ACM International Conference on Data Science, E-learning and Information Systems. Once again, Shadi Aljawarneh is always program chair.
Do not confuse it with DATA, the ACM Workshop on Data Acquisition To Analysis. Easy mistake to make; it’s not like ACM could check their own event acronyms for uniqueness.
DATA follows the same pattern as ICEMIS: the 2018 edition starts as a mild citation vehicle for Radhakrishna, with 5 papers citing him unreasonably, such as this one with 35 Radhakrishna references out of 44. In 2019, it gets worse, starting with the front matter, which has the same reference list as ICEMIS 2019’s, and continuing with nearly a dozen papers that cite Aljawarneh in irrelevant contexts, such as this one that incremented his citation count by 18.
After a break, DATA 2021 was back and ready to get some science done… no, sorry, I meant ready to increase Vangipuram Radhakrishna’s h-index, thanks to over a dozen papers citing him dozens of times each. Take this paper, for instance, which cites him 56 times for little effort, since its reference list is nearly the same as the one right before it in the proceedings.
What’s the moral of the story?
Fraud can pay off.
Shadi Aljawarneh has 6082 citations and an h-index of 38 per Google Scholar,
above many well-regarded researchers. This probably helped him sit on the editorial board of PeerJ Computer Science, alongside well-regarded researchers.
There are generally few consequences for research misconduct, as Dorothy Bishop documented.
Fraud at this scale isn’t hard to find if someone is looking: none of the problems I report above require special skill to spot. These are not complex scientific issues; they are so basic that even skimming a paper sets off alarm bells. Unfortunately, the sound of those bells apparently didn’t bother the big publishers, perhaps because they have few incentives to do anything about it.
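To show just how basic these checks are, here is a minimal sketch of the kind of screening anyone could run over a proceedings: count what share of a paper’s reference list mentions a single surname and flag anything above a cutoff. The 35% cutoff echoes the share I used above; the function names and example references are invented for illustration, not taken from any publisher’s tooling.

```python
# Minimal sketch (illustrative only): flag papers where one surname accounts
# for an outsized share of the reference list. The threshold and the example
# data are made up for demonstration purposes.

def reference_share(references: list[str], surname: str) -> float:
    """Fraction of references whose text mentions the given surname."""
    if not references:
        return 0.0
    hits = sum(1 for ref in references if surname.lower() in ref.lower())
    return hits / len(references)

def is_citation_vehicle(references: list[str], surname: str,
                        threshold: float = 0.35) -> bool:
    """Flag a paper when one surname exceeds `threshold` of its references."""
    return reference_share(references, surname) > threshold

if __name__ == "__main__":
    # Hypothetical reference list: 3 of 4 entries cite the same surname.
    refs = [
        "V. Radhakrishna et al. Some data mining paper. 2017.",
        "V. Radhakrishna et al. Another data mining paper. 2018.",
        "A. N. Author. An unrelated paper. 2016.",
        "V. Radhakrishna et al. Yet another data mining paper. 2019.",
    ]
    print(is_citation_vehicle(refs, "Radhakrishna"))  # True: 75% of references
```

A dozen lines of code and a machine-readable reference list are enough to surface every citation-stuffed paper described in this post.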
It is possible to automate at least some fraud detection! Guillaume Cabanac’s Problematic Paper Screener
finds common issues in papers such as “tortured phrases”, instances in which words have been replaced by synonyms to avoid plagiarism detection software,
often resulting in unintentional hilarity.
The Problematic Paper Screener is what flagged an IARES paper in the first place and led to this work.
There are currently 24 ACM papers flagged by the PPS as suspicious, plus 8 which cite papers themselves flagged as suspicious.
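To give a flavor of how mechanical this kind of screening can be, here is a toy sketch of tortured-phrase matching. It is not how the Problematic Paper Screener actually works internally; the phrase list contains a few widely reported examples, and everything else (function names, sample text) is made up for illustration.

```python
# Toy illustration of tortured-phrase detection: scan text for known
# synonym-mangled phrases. The entries below are a few widely reported
# examples; a real screener relies on a much larger curated list.

TORTURED_PHRASES = {
    "counterfeit consciousness": "artificial intelligence",
    "profound learning": "deep learning",
    "huge information": "big data",
}

def find_tortured_phrases(text: str) -> list[tuple[str, str]]:
    """Return (tortured phrase, likely original term) pairs found in the text."""
    lowered = text.lower()
    return [(bad, good) for bad, good in TORTURED_PHRASES.items() if bad in lowered]

if __name__ == "__main__":
    sample = "We apply profound learning to huge information from sensor networks."
    print(find_tortured_phrases(sample))
    # [('profound learning', 'deep learning'), ('huge information', 'big data')]
```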
Sleuthing work like this can lead to retractions, including at ACM: this paper was retracted after being flagged as computer-generated, and this entire conference was retracted due to fraudulent peer review, as documented on RetractionWatch.
If the scientific community wishes to be taken seriously, we must set up incentives for publishers such as ACM to do at least basic checks of the texts they publish and the venues they sponsor.
I would like to thank Steve Haroz, Kendra Albert, and Jannik Peters for their feedback, and Guillaume Cabanac for his Problematic Paper Screener and the community he has built around it.