The CO₂ concentration increased by 18% from 1970 to 2000, and temperatures rose during this timeframe. Assuming a radiative forcing of 3.7 W/m² per doubling of CO₂ (the generally accepted value), this would give a forcing of around 0.66 W/m² from carbon dioxide.
Are you computing (log(1.18)/log(2)) × 3.7 W/m² to get 0.66 W/m²?
Because when I do that it gives me 0.88 W/m².
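The disputed arithmetic is easy to check directly. Here is a minimal sketch of the standard logarithmic forcing approximation quoted above; the function name `co2_forcing` is illustrative, not from any cited paper:

```python
import math

def co2_forcing(ratio, f2x=3.7):
    """Radiative forcing (W/m^2) for a given CO2 concentration ratio,
    using the standard logarithmic approximation: F = f2x * log2(ratio),
    where f2x is the forcing per doubling of CO2."""
    return f2x * math.log(ratio) / math.log(2)

# An 18% increase (ratio 1.18) gives roughly 0.88 W/m^2, not 0.66 W/m^2.
print(round(co2_forcing(1.18), 2))
```

This supports the 0.88 W/m² figure: there is no way to get 0.66 W/m² from an 18% increase with a 3.7 W/m² doubling value.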
But, as you may recall, there is a 25-50 year lag in the climate's response to an increase in CO₂: by the end of that lag, only about 60% of the warming the increase will cause has occurred. So there is a lot more warming still to come from the change in CO₂ over that period.
Assuming that this was the only forcing responsible for the temperature increase from 1970-2000, it's no wonder one can get unimaginably high sensitivities.
True, but irrelevant, because this assumption is not made. In the papers examined by Oreskes, and in many more since, estimates of the natural forcings are included whenever the paper can be said to directly support the claim that most of the warming since 1950 is anthropogenic.
With the sun's magnetic field more than doubling over the last 100 years, and with CO₂ concentrations increasing by 30% during the 20th century, this represents substantial forcing from both natural and anthropogenic sources.
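For the CO₂ half of that claim, the same logarithmic approximation puts a number on "substantial". A sketch, assuming the 3.7 W/m² per-doubling value used earlier in this thread:

```python
import math

# Forcing from the ~30% CO2 rise over the 20th century (ratio 1.30),
# using the standard logarithmic approximation with 3.7 W/m^2 per doubling.
forcing = 3.7 * math.log(1.30) / math.log(2)
print(round(forcing, 2))  # roughly 1.4 W/m^2
```

Note that no comparable single figure follows for the solar-magnetic claim, which is why it is questioned below.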
Is it possible that this claim about the sun's magnetic field is questionable?
The Oreskes study that you cite is not very robust.
I don't know what you mean by that. The abstracts are there for anyone to confirm.