
Research Perspectives: Reconsidering the Role of Research Method Guidelines for Interpretive, Mixed Methods, and Design Science Research

doi: 10.17705/1jais.00692
Information systems (IS) scholars have proposed guidelines for interpretive, mixed methods, and design science research in IS. Because many of these guidelines have also been suggested as criteria for what counts as good or rigorous research, they may be used as checklists in the review process. In this paper, we raise the question: To what extent do research guidelines for interpretive, mixed methods, and design science research offer evidence that they can be used to evaluate the quality of research? We argue that scholars can use these guidelines to evaluate what good research is only if there is compelling evidence that they lead to certain good research outcomes. We use three well-known sets of guidelines as examples and argue that they do not seem to offer such evidence. Instead, the “evidence” is often an authority argument, popularity, or examples demonstrating the applicability of the guidelines. If many research method principles we regard as authoritative in IS are largely based on speculation and opinion, we should take these guidelines less seriously in evaluating the quality of research. Our proposal does not render the guidelines useless. If the guidelines cannot offer cause-and-effect evidence for the usefulness of their principles, we propose viewing them as idealizations for pedagogical purposes, which means that reviewers cannot use them as checklists to evaluate what good research is. While our examples are from interpretive, mixed methods, and design science research, we urge the IS community to consider the extent to which other research method guidelines offer evidence that they can be used to evaluate the quality of research.
- University of Vaasa, Finland
- University of Jyväskylä, Finland
ta113, 380, research guidelines, mixed methods, design science, interpretive research, information systems science, theory of scientific methodology, methodology, research methods, theory of science
