Saturday, March 23, 2013

Proxy viewer with choice of dating and range



In a previous post I showed an active viewer for the Marcott et al proxies, and in the next I described the code. There has been interest in various extensions, to give the option of showing the revised dating of Marcott et al, and also to show a closeup on more recent times.

This is a combined viewer that does those things. The format change is simply that there are a few more entries in the key table, in black at the start. They are
  • "AuthHolo", for Published (author) dating, -50 to 11300 BP
  • "MarcHolo", for Marcott et al dating, -50 to 11300 BP
  • "Auth2000", for Published (author) dating, -50 to 1950 BP
  • "Marc2000", for Marcott et al dating, -50 to 1950 BP
If you click on any of these, it will change to that scale. You have to click again to select a proxy.
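The viewer is built from pre-made plot images (see the code post below), so each of these entries presumably corresponds to its own set of plots. Purely as an illustration - the column names here are hypothetical, not from the actual files - the four options amount to a choice of date column and x-range when the plots are generated:

# Hypothetical sketch of how the four key entries map to a date column and range.
# Assumes each proxy is a data frame with columns pubdate, marcdate and temp.
opts=list(
  AuthHolo=list(datecol="pubdate", xlim=c(-50,11300)),
  MarcHolo=list(datecol="marcdate", xlim=c(-50,11300)),
  Auth2000=list(datecol="pubdate", xlim=c(-50,1950)),
  Marc2000=list(datecol="marcdate", xlim=c(-50,1950))
)
spag=function(prox,opt){
  plot(opt$xlim,c(-5,5),type="n",xlab="Years BP",ylab="proxy anomaly deg C")
  for(p in prox) lines(p[[opt$datecol]],p$temp)
}
# e.g. spag(prox,opts$Marc2000)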






Friday, March 22, 2013

February GISS Temp down by 0.11°C


There's a mixture of trends this month. GISS LOTI went from 0.6°C in January to 0.49°C. Satellite measures were well down, but NOAA rose from 0.54°C to 0.58°C. TempLS showed a small decline.

In other news, the equatorial Pacific jet changed from cold to warm in March. That may be transient.

The GISS map for Feb is not yet available. I'll update with a comparison when it is.
Update. The map is available - I've compared below.


Update. The GISS site had this note:
"2013-03-21: The update this month was postponed a week while we investigated some bad reports from various stations in Mongolia. NCDC eliminated the reports today."

Here is the GISS map for Feb 2013:


And here, with the same scale and color scheme, is the earlier TempLS map using spherical harmonics:



Previous Months

January
December 2012
November
October
September
August
July
June
May
April
March
February
January
December 2011
November
October
September
August 2011

More data and plots



Thursday, March 21, 2013

Code for the spaghetti plot active viewer



I have occasionally posted active viewers of spaghetti plots - most recently the Marcott proxies. There has been interest in how it's done, so I am posting a fairly generic version of the code here. A combination of R and Javascript is required, although I've been able to make the Javascript work from data, so it doesn't have to be specially written for each plot. The work is done in R.


So I'll post the R code and make the Javascript available in a zip file. In R, the spaghetti plot is made, and then a minimal plot with transparent background is made of each component curve, in black. These are png files. The Javascript organises the superposition of them.
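The essence of each per-curve overlay - done for every curve by the makejs() function below - is a plot with the same x and y ranges as the spaghetti plot, no axes, and a fully transparent background. In outline, assuming xlim, ylim and the curve list xy are already defined:

# One overlay (the loop in makejs() below does this for each curve)
png("X101.png",width=800)
par(bg="#ffffff00")  # fully transparent background
plot(xlim,ylim,type="n",axes=F,xlab="",ylab="")
lines(xy[[1]],lwd=2) # the curve, in black
dev.off()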

Everything you need should be in this zip file. You need to run your data through the makejs() R routine, then display the active.html file.
Update 11.47am 21/3 GMT: I fixed something in the zip file.

The R program

The key element of the R program is the function makejs(). This takes arguments similar to what you would supply to a spaghetti plot routine. The first, xy, is a list of 2-column coordinate matrices, one for each curve. Then xlim, ylim are the x and y ranges, eg c(0,9). These are all required, as is a vector of labels for each curve, to go in the clickable table.

Then there are inputs with defaults. met is a vector of metadata to be displayed with each visible curve, one string of HTML for each. Default is "", which means no display. Fil is a vector with first component a directory, and second a filename prefix. The directory and prefix will be attached to graphics files, and the directory is passed to the Javascript.
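Here is a toy call, just to illustrate the argument shapes (the real call for the Marcott proxies is at the end of the program below). Note that makejs() also uses a global plot width wid, and the graphics directory named in Fil must already exist:

# Toy example: two curves, labels, and one line of metadata each
t1=seq(0,9,by=0.1)
xy=list(cbind(t1,sin(t1)),cbind(t1,cos(t1))) # one n x 2 matrix per curve
wid=800                                      # global plot width used by makejs()
makejs(xy,xlim=c(0,9),ylim=c(-1.5,1.5),
  lab=c("sine","cosine"),
  met=c("Curve 1: sine","Curve 2: cosine"),
  Fil=c("png/","demo"))                      # the png/ directory must exist first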

In the program as shown, I then assemble the inputs for the Marcott proxies and run the function. Outputs are the spaghetti plot, black line plots and a file called prox.js, which carries information to the Javascript.

HTML and Javascript

It's best at this stage to run with all files in the same directory, with the images in a subdirectory. The html is called active.html, and you can run it locally. Just click in the file explorer. Files you should have there are:
active.html, prox.js
reddot.gif, X.png, X101.png et seq, and Xkey.png (the table).
X is your file prefix (input to makejs()); images should be in your designated graphics dir.

The Marcott stuff uses prox.sav as data - also on the zip.

The R program code


# This program was written by Nick Stokes, www.moyhu.blogspot.com Mar 21 2013, to process input for a complex plot, and to output the graphics and js files for an active plotter. Described at: www.moyhu.blogspot.com
# Made available under the terms of the GNU GPL

#The key routine is makejs(), which can be used generally.
#I am using it here for the Marcott et al proxies, described
#https://moyhu.blogspot.com.au/2013/03/an-active-viewer-for-marcott-et-al.html

makejs=function(xy,xlim,ylim,lab,met="",Fil=c("./","proxy")){
  # Note: wid (plot width in pixels) is a global, set before makejs() is called
  N=length(xy); fil=paste(Fil, collapse="")
  # Working out the layout for the key
  ii=round(sqrt(N/2)); jj=N%/%ii+1;
  iw=round(max(nchar(lab))*7.5); nw=iw*ii; nh=jj*18;
  # Assembling data for javascript
  b=sprintf('Yxdir=["%s","%s"]; YxT=[%s,%s,%s,%s,%s];',
    Fil[1],fil,N,jj,ii,iw,18)
  b=c(b,paste("Yxmet=['",paste(met,collapse="','"),"'];",sep=""))
  write(b,file="prox.js")

  # The spaghetti plot
  cls=rainbow(N,v=0.99,end=0.9)
  png(sprintf("%s.png",fil),width=wid)
  plot(xlim,ylim,type="n",xlab="Years BP",ylab="proxy anomaly deg C")
  for(i in 1:N){
    lines(xy[[i]],col=cls[i])
  }
  dev.off()
  # The black line plots - one per curve, on a transparent background
  for(i in 1:N){
    png(sprintf("%s%s.png",fil,100+i),width=wid)
    par(bg="#ffffff00") # The transparent background
    plot(xlim,ylim,type="n",axes=F,xlab="",ylab="")
    lines(xy[[i]],lwd=2)
    dev.off()
  }
  cls=rainbow(N,v=0.7,end=0.9) # darken the colors for the key text
  # Now make the key - labels lab[]
  png(sprintf("%skey.png",fil),width=nw,height=nh)
  par(mai=c(0,0,0,0))
  plot(c(0,nw),c(0,nh),type="n",xlim=c(0,nw),ylim=c(0,nh),xlab="",ylab="",axes=F,xaxs="i",yaxs="i")
  for(i in 1:ii-1)for(j in 1:jj-1){ # note: 1:ii-1 is 0:(ii-1) in R
    k=i*jj+j;
    if(k<N) text(i*iw,nh-j*18-9,lab=lab[k+1], col=cls[k+1],pos=4);
  }
  dev.off()
} # End function makejs

## End of generic code. The rest is the Marcott example

# Make anomalies about tm period
anomalize=function(tm,xy){
  for(i in 1:length(xy)){
    p=xy[[i]]
    y=approx(p[,1],p[,2],xout=tm)
    xy[[i]][,2]=p[,2]-mean(y$y)
  }
  xy
}

load("prox.sav") # loading data from Marcott et el SM
wid=800
graphics.off()
cls=rainbow(73,v=1,end=0.9)
xlim=c(-50,11300); ylim=c(-5,5);
prox=anomalize(4500:5500,prox)
# Input xlim,ylim,xy,proxst
# xy is list(N) of nx2 matrices, one for each proxy (variable n)
# proxst is a Nx2 string - labels and metadata
lab=c("GeoB5844-2","ODP-1019D","SO136-GC11","JR51GC-35","ME005A-43JC","MD95-2043","M39-008","MD95-2011","ODP 984","GeoB 7702-3","Moose Lake","ODP 658C","MD95-2011;HM79-4","IOW225517","IOW225514","M25/4-KL11","ODP 1084B","AD91-17","74KL","74KL","NIOP-905","NIOP-905","MD01-2421; KR02-06","GeoB 3910","Dome C","GeoB 7139-2","Dome F","18287-3","GeoB 1023-5","GeoB 5901-2","KY07-04-01","Hanging Lake","GeoB 3313-1","Lake 850","Lake Nujulla","PL07-39PC","MD02-2529","MD98-2165","MD79-257","BJ8 13GGC","BJ8 70GGC","MD95-2015","Homestead Scarp","Mount Honey","GeoB 10038-4","TN05-17","MD97-2120","MD97-2121","17940","Vostok","D13822","M35003-4","OCE326-GGC26","OCE326-GGC30", "CH07-98-GGC19","GIK23258-2","GeoB 6518-1","Flarken Lake", "Tsuolbmajavri Lake","MD01-2390","EDML","MD98-2176","MD98-2181","A7" ,"RAPID-12-1K","NP04-KH3, -KH4","Agassiz & Renland","GeoB6518-1","MD03-2707","GeoB 3129","GeoB 4905","MD01-2378","MD02-2575");
lab=paste(1:73,lab)
# Make metadata strings
met=lab
s=c(" Proxy #","<br>Type: ","<br>Calibration: ","<br>Lat=",", Lon=",", Alt=",
"<br>Resolution = ","<br>Season: ","<br>Ref: ")
s=paste('<span style="color:blue">',s,'</span>',sep="")
for(i in 1:73){
p=meta[i,]
met[i]=sprintf("%s%s %s%s%s%s%s%s%s%s%s%s%s%s%s%s%s%s%s",
s[1],p[1],p[2],s[2],p[3],s[3],p[4],s[4],p[5],s[5],p[6],s[6],p[7],s[7],p[8],s[8],p[9],s[9],p[10])
}
# Now run makejs()
makejs(prox,xlim,ylim,lab=lab,met=met,Fil=c("png/","prox"));

Tuesday, March 19, 2013

An active viewer for Marcott et al proxies



In previous  posts I have been going through a simplified emulation of the Marcott et al Holocene temperature reconstruction. I think it will be helpful to provide an active Javascript viewer of the individual proxy data given with that paper.

I made a spaghetti plot of the linearly interpolated proxy temperature anomalies (base 4500-5500 BP) against published dates (not modified). There is a table that you can click on to make each one show up as a black curve. At the same time, the metadata from the Marcott et al SM is shown, and there is a map which shows where the proxy is from.
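For each proxy the anomaly is just its temperature minus its own mean over the 4500-5500 BP window, taken from the linear interpolation; the anomalize() helper in the code post above does this for the whole proxy list. For one proxy p with (date, temperature) columns it amounts to:

# Anomaly relative to the 4500-5500 BP mean, for one proxy p (date, temp columns)
base=approx(p[,1],p[,2],xout=4500:5500) # interpolate onto the base window
p[,2]=p[,2]-mean(base$y)                # subtract the window mean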

Update. I've made an extended viewer which lets you look at the effect of dating change, and to see a plot of just the last 2000 years.

Update. At Climate Audit, it was suggested that it would be good to be able to black two (or more) curves at once. That is easy, but it is not so easy to make clear which is which. So I've added a swap button. To compare two curves, select one, then the other, and then click swap as often as you want. The last two curves will alternate.










Sunday, March 17, 2013

Next stage of Marcott et al study



In my last post, I described a rather primitive emulation of a long period proxy temperature reconstruction by Marcott et al, going back to 11300 BP. I tried first using the published dates, and got a fairly unsurprising result, similar to the paper, except that it lacked a recent spike visible there. Then I tried using the modified dates of Marcott et al, and got a generally similar result, except for a very large initial spike.

I have found a rather trivial reason for this, but I'm not sure what to make of it. At least one proxy sheet in the spreadsheet, proxy 65 RAPID-12-1K, had some junk in a block of numbers well down from the main data block. These are in the data columns E:G, which are, respectively, published date, temperature, and Marcott date. Update: proxies 64 and 68 have similar junk.

My program read those as data, and for some proxies it considerably affected the interpolation. The reason for the difference is that for published dates, the extra numbers seemed to be dates beyond 11300, which did no harm. But in col G, the Marcott dates, they were small numbers, and interpreted as early years.

When I fixed this, and read only numbers in the consecutive data block, the numbers were much more reasonable, although a spike remained.
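I haven't posted the revised reading code, but the fix amounts to keeping only the uninterrupted block of rows at the top of each sheet and ignoring anything after the first gap. A sketch of that idea (my own illustration, not necessarily the exact code), for a proxy read into a data frame p holding the three columns from E:G:

# Keep only the leading contiguous data block; rows after the first all-NA row
# (the gap before the stray numbers) are dropped
trim_block=function(p){
  gap=which(apply(is.na(p),1,all)) # rows with no data at all
  if(length(gap)>0) p=p[seq_len(gap[1]-1),,drop=FALSE]
  p
}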

I don't know if a corresponding issue may have affected the paper. I would doubt it, although it would explain the discrepancy between the thesis version and the Science paper.

Update I have posted an active viewer for the individual proxies.

Update - I have made some minor changes to the plots:
1. I changed the anomaly base from 5800-6200 BP to 4500-5500. The period I initially used was from Fig S26, but the latter is the right one. I don't think it made a perceptible difference.
2. Romanm at CA noted some spurious zeroes in proxy 62. These caused the earlier dip at about 9000 BP.
3. I improved the y-axis alignment of the combined plots.

Details below.

Here are the plots. Firstly the new plot with published dates, which is not much changed:



Then the plot with Marcott revised dates. This still has a considerable spike, but less than before:



And here are the two orange curves solo:


Discussion


The new spike is less than before, and less than for Marcott et al. But it still represents a change caused by dating, and I'm looking into the reasons. I think it may be particular proxies; there are two main contributions.

One is from proxy 44: Mount Honey on Campbell Island, which has the early part of the date range rather compressed - intervals of about 30 years go to about 10 years. The temperatures themselves are unremarkable, but the trends increase.

The other comes from two proxies which have large negative anomalies, but which disappear from the early years on redating. These are 54: OCE326-GGC30 and 58: Flarken Lake. Re-dating moves their dates back by about 100 and 150 years respectively, and in both cases there had been substantial cooling over many centuries.

Code

I won't post a new version yet. There are no method changes; just messing around to get a contiguous block of numbers, and some convenience stuff (which is useful) connected with the two date cases.

Saturday, March 16, 2013

My limited emulation of a Marcott et al reconstruction



There has been much discussion of a paper by Marcott et al, which is a proxy temperature reconstruction of low resolution proxies going back to 11300 BP. They are distinguished by not requiring calibration relative to measured temperatures.

There has been discussion in many places, but especially at CA and WUWT, where it isn't popular. Steve McIntyre at CA has done a similar emulation.

The complaints are mainly about some recent spikes. My main criticism of the paper so far is that they do plot data in recent times with few proxies to back it. It shows a "hockey stick" which naturally causes excitement. I think they shouldn't do this - the effect is fragile, and unnecessary. Over the last century we have good thermometer records and don't need proxy reinforcement. Indeed the spikes shown are not in accord with the thermometer record, and I doubt if anyone thinks they are real.

Anyway, I thought I would try for a highly simplified recon to see if the main effects could be emulated. R code is provided.

Update I have posted an active viewer for the individual proxies.

Update - I've done another recon using the revised Marcott dating - see below. It makes a big difference at the recent end, but not much elsewhere. 

Further update - I've found why the big spike is introduced. See next post.

Data

The data is an XL spreadsheet. It is in a modern format which the R utility doesn't seem to read, so I saved it, read it into OpenOffice, and saved it in an older format. There are 73 sheets for the various proxies, over varying time intervals. The paper and SM have details. I'm just using the "published dates" and temperatures. The second sheet has the lat/lon information.

Marcott et al do all kinds of wizardry (I don't know if they are wizards); I'm going to cut lots of corners. Their basic "stack" is a 5x5 degree grid; I could probably start without the weighting, but I've done it. But then I just linearly interpolate the data to put it onto regular 20-year intervals starting at 1940 (going back). They also use this spacing.
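Concretely, the target grid is the midpoints of 20-year bins in years BP - 10, 30, ..., 11290, where 10 BP is 1940 AD - and each proxy is put onto it with R's approx(). The tm and approx() lines in the code below do exactly this; for one proxy:

m=11300/20
tm=1:m*20-10  # 10, 30, ..., 11290 years BP (10 BP = 1940 AD)
# One proxy (published date BP, temperature) onto the common grid;
# approx() gives NA outside the proxy's own date range
gi=approx(prox[[1]][,1],prox[[1]][,2],xout=tm)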

Weighting

The trickiest step is weighting. I have to count how many proxies are in each cell at any one time, and down-weight where there is more than one per cell. The Agassiz/Renland proxy was a special case with two locations; I just used the first lat/lon.

After that, the recon is just the weighted average.

Update - I should note that I am not using strict area weighting, in that I don't account for the reduced area of high latitude cells. I thought it was unreasonable to do so because of the sparsity. An Arctic proxy alone in a cell is likely to be surrounded by empty cells, as are proxies elsewhere, and so downweighting it by its cell area just has the effect of downweighting high latitudes for no good reason. I only downweighted where multiple proxies share a cell. That is slightly favorable to Arctic, since theoretically they can be closer without triggering this downweight. However, in practice that is minor, whereas the alternative of strict area weighting is a real issue.
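In other words, if k of the proxies reporting at a given time step share a 5x5 cell, each gets weight 1/k, and the recon value for that step is the weighted mean. A small sketch of that single step (the loop in the code below builds the same weights with match() and table()):

# One time step: x = anomalies of the proxies with data, cell = their 5x5 cell ids
cell_mean=function(x,cell){
  wt=1/table(cell)[as.character(cell)] # weight 1/k for a cell holding k proxies
  sum(x*wt)/sum(wt)
}
# e.g. cell_mean(c(0.2,-0.1,0.4),c(7,7,12)) gives ((0.2-0.1)/2+0.4)/2 = 0.225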
 


Graphics

I made a png file with a transparent background, and pasted it over the Marcott Fig 1B, which is the 5x5 recon over 11300 years. There was a lot of trial and error to get them to match. I kept both axes and labels so I could check alignment.

I'll probably add some more plots at shorter scales. The fiddly alignment takes time.

The plot

Mine is the orange curve overwriting their central curve and blue CI's.



Discussion

The most noticeable thing is that I get a late rise but not a spike. Again I'm not doing the re-dating etc that they do.

Another interesting effect is that the LIA dips deeper earlier, at about 500 BP, and then gradually recovers, while M et al get the minimum late.

Overall, there is much less smoothing, which is probably undesirable, given the dating uncertainty of the proxies. There is a big dip just after 10k BP, and a few smaller ones.

Aside from that, it tracks pretty well.

Re-dating

Update:
I ran the code again using the revised dating put forward by Marcott et al. It's mostly Marine09, but not always. It doesn't make much difference to the curve overall, but does make a big difference at the recent end. Here is the revised overlay plot:



And here are the orange plots shown separately, published dates left, revised dates right:



I'll look into the difference and post again.



The code

There is a zip file with this code, the spreadsheet in suitable format, and other input files here. I've done some minor formatting edits to the spreadsheet, which may help.
Update - Carrick reported that Mac users can't use the xlsReadWrite utility. I've modified the code so that it immediately writes the input to an R binary file (prox.sav), which I have put in the zip file. So if you have this in the directory, you can just set getxls=F below.


#setwd("C:/mine/blog/bloggers/Id")
# A program written by Nick Stokes www.moyhu.blogspot.com, to emulate a 5x5 recon from
#A Reconstruction of Regional and Global Temperature for the Past 11,300 Years
# Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, and Alan C. Mix
#Science 8 March 2013: 339 (6124), 1198-1201. [DOI:10.1126/science.1228026]
#https://www.sciencemag.org/content/339/6124/1198.abstract

# Need this package:
getxls=F # set to T if you need to read the XL file directly
if(getxls) library("xlsReadWrite")
# The spreadsheet is at this URL "https://www.sciencemag.org/content/suppl/2013/03/07/339.6124.1198.DC1/Marcott.SM.database.S1.xlsx"
# Unfortunately it can't be downloaded and used directly because the R package can't read the format. You need to open it in Excel or OpenOffice and save in an older format.

loc="Marcott.SM.database.S1.xls"
prox=list()
# Read the lat/lon data
if(getxls){
lat=read.xls(loc,sheet=2,from=2,naStrings="NaN",colClasses="numeric")[2:77,5:6]
} else load("prox.sav") # If the XL file has already been read
o=is.na(lat[,1])
lat=as.matrix(lat[!o,])
# Assign each proxy a 5x5 degree cell; wu[i] is the cell id for proxy i
w=rep(0,36*72)
wu=1:73; k=1;
for(i in 1:73){
  j=lat[i,]%/%5;             # (lat,lon) integer-divided into 5 degree steps
  j=j[1]+j[2]*36+1297;       # combined into a single cell index
  if(w[j]==0){w[j]=k;k=k+1;} # first proxy seen in a cell gets a new id
  wu[i]=w[j];
}
# Read and linearly interpolate data in 20 yr intervals (from 1940)
m=11300/20
tm=1:m*20-10
g=list()
h=NULL
if(getxls){
for(i in 1:73){
prox[[i]]=read.xls(loc,sheet=i+4,from=2,naStrings="NaN",colClasses="numeric")[,5:6]
}
save(prox,lat,file="prox.sav")
}

for(i in 1:73){
u=prox[[i]]
g[[i]]=approx(u[[1]],u[[2]],xout=tm) # R's linear int
y=g[[i]]$y
y=y-mean(y[-25:25+250]) # anomaly to 4500-5500 BP # period changed from 5800-6200
h=cbind(h,y)
}
r=NULL # Will be the recon
# now the tricky 5x5 weighting - different for each time step
for(i in 1:nrow(h)){
  o=!is.na(h[i,]) # marks the proxies with data at this time step
  g=h[i,o];
  u=wu[o]; n=match(u,u); a=table(n); d=as.numeric(names(a)); # count proxies per cell
  e=u; e[d]=a; wt=1/e[n]; # wt = 1/(number of proxies in that proxy's cell)
  r=c(r,sum(g*wt)/sum(wt)) # weighted mean for this time step
}
# Now standardize to 1961-90, via the M et al value of -0.04 at 1950 BP (r[98] is tm=1950)
r=r-r[98]-0.04
# now tricky graphics. myrecon.png is the recon. ma.png is cut from fig 1b of the paper. You have to cut it exactly as shown to get the alignment right
png("myrecon.png",width=450,height=370)
par(bg="#ffffff00") # The transparent background
par(mar=c(5.5,5,2.5,0.5))  # changed to better align y-axis
plot(tm,r,type="l",xlim=c(11300,0),ylim=c(-1.2,0.8),col="#ff4400",lwd=3,yaxp=c(-1.2,0.8,5))
dev.off()
# Now an ImageMagick command to superpose the pngs
# You can skip this if you don't have IM (but you should)
system("composite myrecon.png ma.png recons.png")

Saturday, March 9, 2013

TempLS global temp down 0.02°C in February


The TempLS monthly anomaly for February 2013 was 0.417°C, down slightly from 0.433°C in January. I had earlier calculated a rather larger drop, but more incoming data have reduced that. It's still rather early, but there are 3832 data points, which is quite a good number.

This contrasts with very large drops in the satellite indices. But again, they had risen much more in the previous month.

Here is the spherical harmonics plot of the temperature distribution:



And here is the map of stations reporting: