The only thing I can imagine is that their criteria were based on fairly superficial features, and they didn't bother to dig deeper, or dismissed pesky information that disagreed with their criteria.
One of the issues is that they use checklists and presume nothing major is wrong with the study. Detailed analysis does not occur. Typically a participant in a review has to review thousands of studies in less than a year. That is on the order of three a day, with only a part-time effort to do it. Tickbox criteria miss so much, especially with so little time. On the other hand, many of us trained in academia, science, math etc. have each spent hundreds of hours reviewing just a few studies. EBM is a guide, but it is not nearly infallible. In fact it's been revealed to be very fallible and capable of being rorted - PACE shows this.
I am not in favour of EBM for this reason. What I am in favour of is EBP, or evidence based practice. The difference is that doctors are given the skills to evaluate papers themselves, so when something is important they can do the in-depth analysis that reviewers are not usually trained to do.
The two are complementary, however. If doctors had the skills, and more reviewers had them too, then EBM could not be so easily used as a railroad for bad science. Further, if the current push toward evidence-based management of medicine were not distorting EBM even further, things would be heading down a better path for medicine and the community.
There needs to be a clear separation of scientific EBM and managerial EBM. They clash.