Societal benefits for CBT & GET probably overstated (and important data not shown)
The paper justifies the cost-effectiveness of CBT & GET on Healthcare costs alone. But it also plays up the enhanced cost-effectiveness using Societal costs, which include the costs of informal care as well as healthcare.
Yes, not only is CBT & GET better than SMC alone, it saves society money too. Or maybe it doesn't, depending on the assumptions used.
The major contributor to the Societal savings for CBT & GET was informal care, since lost employment was not significantly different between any of the groups.
The authors chose to value this informal care at national mean earnings of £14.60 an hour. That's not real money: it's what the informal carers would earn if they worked instead of providing care, and were paid the average national wage.
A more conservative way to value this care is at the UK minimum legal wage of £5.93 an hour. The authors looked at this option in their 'sensitivity' analysis and reported that "it did not have a large impact on cost-effectiveness".
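The gap between the two valuations is easy to quantify. A minimal sketch (the weekly care hours below are hypothetical, purely for illustration; only the two wage rates come from the paper):

```python
# Two ways of valuing the same informal care hours.
MEAN_WAGE = 14.60  # GBP/hour: national mean earnings, as used by the authors
MIN_WAGE = 5.93    # GBP/hour: UK minimum wage, the conservative alternative

def informal_care_cost(hours: float, wage: float) -> float:
    """Value a number of informal care hours at a given hourly wage."""
    return hours * wage

# Hypothetical example: 10 hours of informal care per week.
weekly_mean = informal_care_cost(10, MEAN_WAGE)  # valued at mean earnings
weekly_min = informal_care_cost(10, MIN_WAGE)    # valued at minimum wage

# The minimum-wage valuation is only ~41% of the mean-wage one, so any
# 'saving' attributed to reduced informal care shrinks by almost 60%.
ratio = MIN_WAGE / MEAN_WAGE
print(f"{weekly_mean:.2f} vs {weekly_min:.2f} (ratio {ratio:.3f})")
```

In other words, the choice of wage rate scales the largest component of the Societal savings by more than half, so it is hardly a minor assumption.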
'Not a large impact' isn't a very precise term, particularly when CBT and GET didn't have a large impact on QALY outcomes either. No data are presented, though under 'Limitations' the authors say:
This may not be entirely correct. Table 6 shows the Societal cost savings for CBT & GET relative to SMC, using national mean wages to value savings in informal care. However, as the figures below show, those savings vanish when the informal care is valued at the minimum wage instead:
View attachment 3659
This looks rather different. GET now has a cost of £225 (as opposed to a saving of £472), while CBT is cost-neutral instead of saving nearly £700. So the authors' claim that:
doesn't apply to the minimum wage scenario. Similarly, the claim that CBT & GET 'dominate' SMC (i.e. are more effective and save money) doesn't apply under the minimum wage scenario. And these figures are hardly consistent with the claim that "results were robust for alternative assumptions".
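A back-of-envelope check shows why the valuation matters so much. Assuming (as seems to be the case) that only the informal-care component changes between the two scenarios, the GET figures above imply a large informal-care saving at mean wages:

```python
# Back-of-envelope inference from the figures quoted above.
# Assumption: only the informal-care component is revalued between scenarios.
MEAN_WAGE, MIN_WAGE = 14.60, 5.93
ratio = MIN_WAGE / MEAN_WAGE          # each care hour worth ~41% as much

# GET vs SMC: a ~GBP 472 saving at mean wages becomes a ~GBP 225 cost
# at the minimum wage -- a swing of ~GBP 697.
swing = 472 - (-225)

# If revaluation shrinks the informal-care saving by (1 - ratio), then the
# informal-care saving at mean wages must have been roughly:
implied_care_saving = swing / (1 - ratio)  # ~GBP 1174
print(f"Implied informal-care saving at mean wage: GBP {implied_care_saving:.0f}")
```

If this rough inference is right, the informal-care component dwarfs every other element of the Societal result, which is exactly why the wage assumption deserved explicit reporting.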
The authors could easily have reported the impact of using the minimum wage in the paper, rather than saying the sensitivity analysis didn't reveal 'large' differences. Or they could have put the data in an online appendix. It doesn't look good when unflattering data are hidden, and the value of a sensitivity analysis is diminished if its results aren't fairly or adequately reported.