Hi Ember, I suspect you are arguing at a tangent to my argument, and hence misunderstanding its purpose. Broad definitions have value in an appropriate research setting; the way they are currently being used, however, is not appropriate.
If Lipkin had unlimited funds and unlimited resources, I think he would indeed be doing what I suggest, in addition to many other lines of enquiry. When you are talking about good research design, there is an implicit issue of cost efficiency, where cost covers not just money but the other things I mentioned. A cost-efficient design is also restrictive in what it can uncover.
As I said before, this is a yin-yang argument. We focus on highly selective (reductionist) research because it's cost-effective. My argument, though, is that it's not always outcome-effective. With enough resources you can do so much more. In time I think this will become the norm, as the approach I am suggesting lends itself to extensive automation.
Once developed enough, it could become the very best approach for many problems, and far more cost-effective. It's just that we have to develop the tools, which requires resources. The Human Genome Project was uber-expensive, and this will be more expensive than that. That doesn't mean it can't be pursued - we are in fact doing that also. It just means it's an expensive road to take, even if it holds more promise for more of us in the long run.
The current use of clustering in proteomics for patients with similar diseases, like CFS and post-Lyme, is a case in point. If you restrict the patient cohort too much, you may lose many of the pathways, stifling research for another generation. If you loosen the definition too much, you introduce too much noise and increase the resources required to solve the problem. It's a question of balancing the two criteria. I suspect we will focus on becoming more reductionistic, but if that happens we may well miss a big piece of the answer. A rough sketch of the kind of analysis I mean is below.
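For what it's worth, here is a minimal sketch of that kind of analysis, with entirely synthetic data standing in for proteomic profiles from a loosely defined cohort. None of the numbers, group counts, or marker counts come from any real study; they are made up purely to illustrate how you would look for hidden subgroups (pathways) in a broad patient pool using standard clustering tools.

```python
# Toy illustration (synthetic data only): clustering proteomic-style profiles
# from a loosely defined cohort to see whether hidden subgroups emerge.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Simulate three hidden "pathway" subgroups spread across a broad cohort
# (say, mixed CFS and post-Lyme patients), 40 protein markers per patient.
n_per_group, n_markers = 50, 40
centers = rng.normal(0.0, 1.0, size=(3, n_markers))
profiles = np.vstack([
    centers[g] + rng.normal(0.0, 0.8, size=(n_per_group, n_markers))
    for g in range(3)
])

# Standardise markers so no single protein dominates the distance metric.
X = StandardScaler().fit_transform(profiles)

# Try different assumed numbers of subgroups and compare cluster quality.
for k in (2, 3, 4, 5):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette={silhouette_score(X, labels):.2f}")
```

The balancing act shows up directly in a setup like this: define the cohort too narrowly and some of those subgroups never make it into the data at all; define it too loosely and the noise term grows until the clusters stop separating cleanly.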
Bye, Alex