How might you account for the prevalence of the misconception that research has to occur step by step?
I blame the way we teach it. Whenever we teach "The Scientific Method" (in all caps like that), we treat it as though it's some holy ritual that must be performed in the same precise way in order to properly appease the gods of science.
For example, plenty of articles present The Scientific Method as a specific series of steps to follow. Some at least acknowledge that the scientific method can be approached in a few different ways, but they still reduce it to a fixed sequence of steps.
But when you get into actual research in the real world, as I have, you find that things are a lot messier than that. Often you don't have any particular hypothesis and are simply exploring. In many (perhaps most) cases you can't actually conduct a controlled experiment, so you have to figure out how to use correlational studies to achieve similar ends. Often the first few approaches you try yield nothing useful, and you have to try something else. The same project can involve multiple layers of exploration, hypothesis testing, and developing new approaches. Yet all of these things are still part of good science. One of my favorite t-shirts says, "If we knew what we were doing, we wouldn't call it 'research.'"
Then of course there are the parts of real research that can hinder good science: competition for funding, the pressure to produce exciting results in order to get published, school politics and fights over prestige, and the temptation of p-hacking and confirmation bias.
What we really should be teaching is why we do science this way. We should teach why it's important to control experiments when we can, and to control for confounding variables when we can't, and how confirmation bias can mislead us all. Maybe we should even teach the messiest parts of science, so that we can have a serious public policy discussion about how we might reform our systems of funding and publication to remove these perverse incentives for bad science.