Roundtable discussion: International Workshop on a Very Large Volume Neutrino Telescope for the Mediterranean Sea 2009




Nuclear Instruments and Methods in Physics Research A 626-627 (2011) S72–S83



Discussion

Roundtable discussion: International Workshop on a Very Large Volume Neutrino Telescope for the Mediterranean Sea 2009☆

A. Bettini a,b, U. Katz c, P. Lipari d, C. Spiering e, A. Watson f, C. Touramanis g

a Padua University, Italy
b Canfranc Underground Laboratory, Spain
c University of Erlangen-Nuremberg, Germany
d Sapienza University of Rome, Italy
e DESY, Zeuthen, Germany
f University of Leeds, UK
g University of Liverpool, UK

Touramanis: I am Christos Touramanis from Liverpool University and, as most of you know, I have been a member of the Design Study phase since the beginning, and I am the moderator for this discussion. The way the organisers and I wanted to organise this Round Table discussion was, instead of having a large panel of experts who have been working in this field for many years, which would have led to a closed discussion amongst the panel, to have a small number of members who would provide us with guidance and starting points for a discussion. This will give a chance for all in the audience to participate, so everybody should feel free to speak up and contribute to the discussion. You can see from the names of the panel, starting from the top, that we have three members who represent the experimental and theoretical community; I think they have all given very nice presentations here yesterday and today, and will be able to point us in the right direction for every question. Then we have two eminent personalities as external experts because, as we now clearly see, the project is moving on and we see in the timeline important decisions and approvals by funding agencies in one to three years. It may not be enough just to have our own technical discussions on how to move along; it would be nice to get some input on how the community in general perceives what we are doing and what we propose. Even if we come to a convergence, and even if we can find the best solution to everything, if the rest of the community (i.e. the rest of the astroparticle, underground laboratory physics, undersea science, and even the more general particle physics communities) is not aware of and does not accept the way we are doing things, then we will still have to revisit our project and sort things out at a later time.

☆ Transcribed by the editors from the video recording of this session. Correspondence should be addressed to the editors at [email protected]

0168-9002/$ - see front matter © 2010 Elsevier B.V. All rights reserved. doi:10.1016/j.nima.2010.09.014

So if you look at my transparency (Fig. 1), there are a couple of points at the beginning that I call physics case – motivation – outreach, and reality check, and we can discuss them quickly. I would like to ask the panel members to give their opinion in a couple of minutes each on those items, and then we will move on to the second part of the list, which contains the actual decision items: how to select the site, the configuration, the optical module, and the technology. Also, let's try to leave as much time as we can at the end for open discussion. So, for the first bullet point, we hear in general discussions comments like: "As long as IceCube hasn't found anything, why are you sure you want to do it? Are you sure you want to do it?" We know we are sure about it, but clearly the problem is how do we convey this, and how do we persuade the community and the people that matter that we do. So, actually, do we have to sharpen the focus of our plans and our objectives in science? Do we have to produce a sharp and clear set of messages so that this can be conveyed easily outside our community? Do we need to strengthen our arguments in order to win everybody over? Then there are other questions such as: "You can do so much to understand the universe in X-rays, ultra high energy cosmic rays; you have all these fantastic new tools and you see results. Then why do you need neutrinos?" There is a parallel here with neutrinos themselves and particle physics in the 60s and 70s, when most of the HEP community was saying "You have all these other particles that are easy to detect; why do you bother with neutrinos?", and we have seen in the last ten years that neutrinos are at the centre of the international programs. So, I would like to start from the non-Neutrino Telescope people, maybe with Alan (Watson) and then Sandro (Bettini), to give us short comments on this. Watson: To me it is an incredibly exciting field and obviously a very ambitious program.
I thought Uli's (Katz) presentation was fantastic in the way it stressed the challenges. The field really fires my imagination and I don't think at this stage I can add anything very useful to the motivation and the outreach. I would like a chance to

Fig. 1. Outline.

say something about the scale, the sites and maybe the cost in comparison with the AUGER experience, but that will be later on. Bettini: Yes I agree. It's a fascinating project but the scientific program must be better focused. You have to convince, as you said, the community at large. You will be competing with the rest of the community for money, presumably in the next decade or so. If we are lucky the funding for particle physics in general will not increase, so you will be competing with the LHC, with dark matter searches (which will not be very expensive), with beauty factories, etc. So you will have to convince first the communities, and then the agencies, that these 250 M€ are worthwhile. Touramanis: Thank you. I think Paolo (Lipari) yesterday put some very nice questions. As you may have realised I used some of them today. Please give us your perspectives. Lipari: I am replacing Werner Hoffman in some sense, who was invited but cannot be here today. I was talking with Luciano (Moscoso) this morning and it was interesting for me to think about the history of Cherenkov detection of TeV photons. TeV photons were searched for over decades. People had been trying to see this energy range with Cherenkov light since the early 60s, maybe even before, and the first detection happened in 1989. For all these years many people thought they were losing time and money, because they were doing something which was very difficult and they were not succeeding. They were having hints that were actually false. Finally in 1989 – I remember since I was at the meeting – they arrived and said: "This is the Crab nebula at 9 sigma". It was extraordinary and people were very excited. They had the right idea; what allowed them to see the Cherenkov photons was the construction of a 10 m telescope with about 50 PMTs; that is the idea of imaging. And then you saw the Crab – this was a fantastic moment – and now, 20 years later, we have 70 sources and a lot of astrophysics.
You have beautiful images and fantastic science is being done. So nobody in the whole community needs to be convinced anymore. And so, now in the neutrino field, what you are trying to do is very difficult and I think you are in a pre-Whipple, pre-Crab result era. You have to find and create a design that shows what you are able to do, and then you have actually to improve it. I think you are at the stage where you have to find the correct design and demonstrate it. A comment: it is really a crucial moment and what you are trying to do is not impossible. It is difficult; you have to demonstrate it is doable and the future will arrive. People understand, I think, that all of this is very interesting. They say, I think, this is too difficult, this is too costly, so you have to do this first step, etc. But I think once you start then things will come. Concerning the question of the competition, this is actually a delicate question and I think that it should be addressed seriously.

I don't think the competition is with dark matter or the LHC. The competition is with gamma astronomy. I am not in favour of one or the other; none of us is really in favour of one or the other. We want to understand the universe; we want to understand the high energy universe. What is the best way to do it? There is no doubt that certain things can be done only with neutrinos. Many other things cannot be done with neutrinos. Actually, in a zero-sum game when you have only 200 M€ and you have to spend them, I don't know if I would spend them on a neutrino telescope in some sense, because you can do other things. But, I think, you have first of all to get to the "Whipple" stage. You have to demonstrate that you can start to do things, and then the community must find the right equilibrium between these two different things. All of us want to do everything in some sense and want to do things in the best way to understand the universe. When things start to develop, people can be convinced on how to do it. Touramanis: Thank you very much, that was quite useful. I guess we can go to Christian (Spiering). Spiering: I want to say something about the question of the competition. As you know we have written a Roadmap for Astroparticle Physics in Europe in which we presented a funding increase of a factor of two. I agree with Alessandro (Bettini) that this is an extremely optimistic assumption. It looks more like a slight increase; maybe it will stay constant over the next years. It will depend much on whether there will be discoveries. By discoveries I don't mean the next dozen sources with HESS and MAGIC, but something which IceCube will see or something which AUGER will see. We have been trying – for our own exercises – to work out what would be the worst-case scenario. We said OK, let's cut a little bit the money for CTA (we say to them "If you build it small it's a pity, but it will still be good and do clear physics") and there are other things you could cut down.
It was only for two experiments that we said we cannot cut them down: either you build them big or you don't build them. These are AUGER and KM3NeT. And so it turned out in the end, if you make the same figure once more – which most of you know from the Roadmap – there is one big yellow chunk, and if you take that off then everything is nearly flat. This yellow chunk is KM3NeT. So there is a critical situation here, because there are many other people who say "We have nearly guaranteed discoveries". When I discussed it with Uli (Katz) he said "What is the discovery? Are 100 sources from HESS a discovery, or is it just a little new piece of information?" They have positive results which are easy to present to the community, which are understood as being fantastic even by our colleagues who are dealing only with accelerators. So we have to make a very clear case and we have to show that we are in the right ball park, even though IceCube has not yet seen something. Touramanis: Thank you very much; I think that was very clear. I think Uli (Katz) can conclude this round. Katz: Of course I am not quite unbiased in this. I very much agree with Paolo (Lipari): we are at a stage where neutrino astronomy as a tool to understand the universe has to be proven. The technology as such has been proven by IceCube and ANTARES, and we have to set up something that maximises the probability that we will produce more than upper limits. Personally I think the potential in neutrinos is very much worth pursuing. We are in a situation where there is no guaranteed source, in the sense that we can really be sure to see it. Anything that we would see would be a big discovery. I think we have to maximise the chances for such a big discovery. And this was a statement that I wanted to make. Anyway, neutrino telescopes, despite their name, have a range of physics objectives, and what we've learned is that these physics objectives are mutually exclusive if you optimise.
Not in the sense that you cannot do anything on one if you do everything you can on the other, but in the sense that you end up with different

detectors if you optimise for example for dark matter, or for point sources, or for diffuse flux, etc. I think we have to focus on the discovery that is most realistic and that will probably also give us most information on the universe, which, at the current stage, in my view is point sources. In a few years from now we might have learned a lot that changes the picture and we may reconsider this – it is technically possible. But for the time being, I strongly support going for one priority and having second priorities without impact on the design. Touramanis: Speaking of priorities, my second bullet point, on which we might be able to spend even less time, is the "reality check". We want a project to be approved soon, and then to be built fast, so that we can have physics coming out sometime in the second half of the next decade – to be sort of realistic and not very optimistic – but definitely we don't want to still be discussing the same thing ten years from now. So, now is the time to get everything agreed, fixed, written down, so that we can start going (the sooner the better) to our National Funding Agencies, to the International Funding Agencies, to Brussels, maybe seek outside collaborators, in order to actually have this thing approved and to get some wind behind us and start moving. So the question is: do we have to converge now, or does anybody think that we have the luxury of debating for another 3–4–5 years before reaching decisions? The other item, which for me is quite evident but on which I would like to hear other views, is: given the physics priorities, do we go primarily for point sources, for example, or not? And I agree we should probably do that at least for phase one.
The physics we want to do, the selection of site, the configuration, the technologies – they all, or maybe some of them, are interrelated, which means that not only do we have to converge on the site, but we have to try to make decisions on all aspects, in a given sequence or some of them in parallel. We cannot afford to say that we could discuss the site in isolation, and then discuss the technology in isolation. Uli (Katz) is nodding, so Uli? Katz: This is an extremely difficult question. Let me start with the correlation between technical decisions and site. Clearly the sites differ in at least two respects, which are their depth and their proximity to shore. This has an immediate impact on the design, and I think once we really fix the design in terms of a proposal, or a detailed technical description, then the site has to be known, otherwise it cannot be done. On the other hand, a lot of the studies that are going on now are only mildly site-dependent because, for example, depth only impacts those components that are pressure dependent. These are a limited number of components and we know to what depths they qualify, so I think we can continue reasonable work without having a site decision immediately, but we must have a site decision when we really fix things and go to the engineering phase. I think the configuration refers to what I call "the footprint". That means the distances in the detector and the layout of the detector, which are of course intimately related to the type of physics you want to optimise for. This is, in my view, less site-dependent, but rather it is time-dependent, in the sense that we should be able to react to developments in science. It might well be that there are new questions coming up depending on what the LHC finds. It might well be that there are new questions or hints coming up if IceCube makes a discovery.
So we have to be able to react to such developments in due time, and that means that, in my view, the configuration in terms of, for example, distances may be fixed later. Not very much later, but a little later. It has of course to be fixed when we purchase the components. Now, all this process involves a scientific–technical aspect, which we, the consortium, are here to cover, but it is also intimately related, of course, to the funding situation, and the sites are intimately related to the funding situation. Therefore the matrix of correlations is even worse than is indicated here, and I think

this is probably something that will come up later, but we may not neglect this correlation. Touramanis: Thank you for that; now let's go in reverse order to last time. Maybe Christian has some comments from their experience. Spiering: I just want to support the opinion that point sources (with point sources I mean particularly Galactic point sources) should be in the focus. This gives me the possibility to address the question "How good are the limits for extragalactic point sources which we have from the IceCube diffuse limits? What is our expectation for the number of sources, following Paolo's (Lipari) arguments?" I should again remind you that there are some assumptions, such as: isotropically distributed sources, a more or less fixed neutrino luminosity for all sources so that they do not differ by orders of magnitude, a dN/dE ∝ E^−2 spectrum, a Euclidean universe, a lot of things… If we take the present diffuse limit which we have from AMANDA (IceCube has not yet published its diffuse limits), then the number of extragalactic sources which we could expect for IceCube is between 1 and 10. So now let us assume that we improve our diffuse limit by a factor of 10 – and that would be a limit, not an observation or even a hint of an observation – then the number of extragalactic sources expected would go down to smaller than one. Again, always under the assumptions – Paolo please correct me if I interpret you wrongly – which I stated at the beginning. All this of course is not true for galactic sources, because they are obviously not isotropic, etc. (Note: In support of this argument Prof. Spiering had prepared five transparencies, which were included in the conference web site but were not shown during the discussion. They are in the appendix to this transcript.) Touramanis: Thank you for that. I don't know if Paolo (Lipari) has something to add. Lipari: Let me make a comment.
I really have a great admiration for Christian (Spiering), and Leo (Resvanis), and Tonino (Capone) – the people who are thinking of really building a detector. I think this is extraordinarily difficult, and I feel sometimes useless doing a calculation when these people are doing the things. And of course doing a thing requires a lot of dealing with the real world. I think you guys should wait for a few years... The next three years will be very interesting. I think there are at least two things you should look out for: (1) the IceCube results and (2) dark matter results, a completely different subject which I think could reveal something quite interesting. I think the PAMELA result is astrophysics, but maybe not. For the LHC you would maybe have to wait a little bit too long. On the one hand there could be new information in these results, but on the other hand you cannot just stop. So do your best but keep your eyes open. On the other issue, of the site (this goes back to Christian's comment about the state of funding and so on): if the community that wants to do things in the Northern Hemisphere does not appear focused and unified, it is not going to get this money, because you will not appear serious to the rest of the world. So maybe the multi-site is the right idea, but it sounds bad to a lot of people. I don't like to be reticent in these things. I said I admire people who can build things, but you want to build things for a deep purpose. You can't work just to develop your laboratory, because this would be wrong. You want to have an ambitious physics goal and pursue it. I think this is really indispensable because, if not, you risk not achieving anything, and again, you have to be focused and unified in this. IceCube was built because they were capable of showing that their community wanted to do it and they were focused. They should get a lot of merit for that. You should do something comparable in order to be able to do what you want.


Touramanis: Thank you very much. I cannot agree more with all of that, but I see that Uli wants to make a comment. Katz: Just one sentence; I mean, maybe I was not quite clear. All you say is exactly the background of my statement. You want to stay tuned to physics developments as long as possible, within a time scale that is given by the boundary conditions. Touramanis: Thank you for that; it is quite clear that you have to be led by the science. If you are not led by the science you will end up in a dead end. If you are led by science but you cannot demonstrate it, you will have a lot of difficulties. Maybe Alessandro (Bettini) has some comment. Bettini: Yes, I agree too, of course. There are two points: 1) There is the physics scope: I am not an expert in the field, but I think that the point sources really are, in first approximation, the main objective. I have very severe doubts on dark matter. I don't really see how it can contribute. It looks to be very marginal, notwithstanding that one's eyes should be kept open. Even a signal as big as the one that is expected, such as that of PAMELA, is clearly due to astrophysics and not to dark matter. If it were dark matter it would be far outside any conceivable values of cross-section or density in the heavens. It is clearly due to pulsars or some other astrophysical sources. So there I would be careful, and I would not wait for LHC results. LHC results might take several years before becoming of value, and the time to decide this is now. If you wait too long the thing will die. 2) Convergence: We organised a workshop in Taormina 12 years ago, and an extreme amount of work has been done on the project. The progress was enormous. But still convergence is needed. Many of the things that are done in NESTOR, NEMO and ANTARES are useful for R&D, but none of them are useful for physics, because they are orders of magnitude too small.
Maybe it would be advisable to converge on working mainly on the R&D that is still needed for the cubic kilometre or the 10 cubic kilometres. Instead, part of the things that are being done now can probably be put in the background. I was looking at some proposals by a number of European groups and some of these were not clearly committed to KM3NeT. So maybe there is something to do to achieve convergence. On the multi-site option: I hope the multi-site option is not there, because it looks to me like a decision not to decide. Touramanis: Thank you very much. Christian (Spiering) wants to make an intervention. Spiering: It's just a reply; I just want to say I am not as sceptical about the dark matter as you. Actually, for spin-dependent cross-sections we are two orders of magnitude better than the direct experiments. But what I want to say is that this has mostly been done already by IceCube and Deep Core. On the other hand, you could buy into it with a Deep-Core-like nested array, which would not cost so much, and then you can cross check what IceCube has seen or not seen. So I just want to say I am not so sceptical. On the other hand, I reiterate that there should be one dominant goal. Columbus would not have done his trip if he had gone to the King to ask for a ship for going here, and there, and a little bit more there. NO! He wanted to go to India; and that is the only way you can sell such things. Touramanis: OK, so I think we agree on that. We will go to Alan (Watson) for practical reasons, since he has to be at the airport at some given time. Watson: The first thing I would say is that it is really important not to lose momentum. It's clear from Uli's (Katz) presentation and all I have heard about the KM3 design study that there is tremendous momentum across Europe on this project.

However, I am bound to say that I found the time scales that Uli presented very optimistic, and I am comparing this with what happened with AUGER, which is a 50 million dollar capital project where the environment of the pampas of Argentina was certainly a much more benign place than the deep Mediterranean. It could very well be that the prototyping will take longer than is thought, and during that period the whole issue of results and point sources will become clearer. I would like to make three points:

1) First of all, make sure that you build it very large. I think the experience of AUGER is perhaps worth bearing in mind. When we decided to go for 3000 km2 there was the plan to build it at the same time in the North and the South at about that size. When Jim Cronin and I were touring the world trying to get collaborators, the question that came up most often was "Why do you want to make it so large?" And if you heard my talk yesterday, the message is that we made it too small. The rate of the really interesting events, at energies above 5 × 10^19 eV, is two per month! If you think of trying to get the spectrum for a particular source you probably need a minimum of 100 events. I am clearly not going to live long enough to see that with the size of AUGER South. The plan is to build AUGER North seven times the size; seven is not a magic figure, it is so because of the land that is available in South Eastern Colorado and also because of the cap in dollars. So for me you should be thinking now of the more expensive option. I suspect that to build something smaller could in the end, unfortunately, be a bit of a waste of money. 2) Now for the issue of sites – I have to say that when I first heard of the three-site option some time ago I really couldn't believe my ears. For the AUGER collaboration we did a worldwide site survey. I want to talk about the Northern hemisphere, but in the South there is in a sense a mirror situation, because we had three possible sites: in Australia, in South Africa, and in Argentina. These are the only places at suitable latitude in the Southern hemisphere where you can get 3000 km2 at the right altitude. We decided without any thought whatsoever that we would be choosing a single site. It didn't cross our minds for a second that we would go for three sites. I am sure if anybody had suggested this to Jim Cronin and myself they would have been squashed and probably kicked out of the collaboration.
I have several reasons for saying this, because I have been reflecting at this meeting on what might have happened if we had gone for three sites. Certainly the three sites would have proceeded at different rates for all sorts of logistical reasons that I will not try to go into. I am sure that some sort of unhealthy competition would have developed. Competition can be very healthy, but it can also be quite destructive. I imagine the funding will be partly European and partly from the countries involved. I think it would be very destructive for the European dimension to try to pursue three sites. You would also require three infrastructures, which seems to me expensive. I don't understand neutrino detectors as well as I understand cosmic ray detectors, but I think there will be problems with edge effects. Volume is very good if you are trying to isolate sources, as in this Deep Core work of IceCube. Surely there will be a loss of economies of scale. There will be a need for cross calibration, and I exposed yesterday the problems of cross calibration between HiRes and AUGER; to do that on three different sites, at different depths, with different water clarity etc. and probably different systems will be very difficult. I really think the management problem would be absolutely enormous. 3) Finally, I just want to say a word about the costs. AUGER in terms of costing was very successful. It was budgeted at 50

million and I think it came in at 52 million dollars at the end. Part of the cost increase was because of currency fluctuations – the breakdown/meltdown of the Argentine economy, which we could not foresee. We used the US WBS system of doing the costing, which I know doesn't map naturally onto the European system, but I think it is worth exploring. It includes contingency, and I think that is a very important thing and that it should be included. Also, though not much has been done on operating costs, I understand that there is a kind of canonical figure which lies between 5% and 15% per year. This goes back at least to the late 18th century, when Herschel, while building his 40 foot telescope (in 1785), went to King George III with the request for money and put in an operating cost of around 10% per year.

Touramanis: What we would like to do now is go to the three main, more focused technical items: site, technology and configuration. I would expect this to be a more open discussion where the panel members give some input and then we would like to hear opinions and more questions from the floor. We would like to keep it to 15 min each. So let's start with the issue of site: when, how, multi-site? Katz: The multi-site discussion has not been developed within the collaboration. It came from outside, as an approach to solve various problems that are rather of a political and not of a scientific sort, and that are also related to funding. I should remind you that some of the funding commitments that are there are site-dependent, so that the money would only be available if the site were A or B or C. This may be the background of this discussion. I fully agree with every single word that Alan said on multi-sites and their complications for operation, for maintaining synchronicity, etc. Nevertheless, since this is a topic that has been brought up, it will be studied at least science-wise to the end of the preparatory phase. This is simply our duty. I feel that there is... maybe not an extremely uniform opinion inside the collaboration, but a certain majority would, at least physics-wise, strongly prefer a single site. How this single site can be determined is not quite clear to me. It seems that for two reasons the consortium itself cannot come to a solid decision: first because it is very hard to agree, and second because the decision is related to facts that are outside the influence of the consortium itself. I think we have to foresee some sort of external committee or whatever that will impose a view. Bettini: The site selection of course is an extremely difficult issue, because there are different interests amongst the members of the consortium. Some are interested in taking the thing to their own site. That is completely natural.
There are political issues and technical issues. Some issues are scientific, such as the transparency of the water, the importance of the depth, the currents, the bioluminescence that might reduce the live time, and so on... These are things that you know perfectly well, so you can put them on paper and then compare the three possible sites on this basis, but this is not all. Another aspect is the management and the organisational situation, the infrastructures that are nearby, e.g. the airports that are nearby, the climate, a number of things... whatever you like. The third aspect, which might even be the most important one, is the money on the table. One country might be willing: in this case there is one country that says "We want this thing at our home and we will put a certain amount of money on the table." I am pretty sure that this will not happen if the site is A, B and C. It will not be the sum of A, B and C. It will be neither A, nor B, nor C. That to me is pretty clear. No country will put 50 million or 100 million on the table to have a third of the thing. I might be wrong...


Spiering: Coming back to the committee from outside—the external review committee. We had a similar situation – but of much lower importance – in IceCube, where we did not come to a decision. The question was how to transmit the data: analog or digital? And so this discussion went on for over a year. Then we said OK, we take an external committee and they make a recommendation. The recommendation was not the DESY solution, which was analog transmission, but the LBNL solution, which was digital. It was a good decision. It was recommended by an outside independent committee. This here has of course much more impact, I agree, but I just want to welcome the idea of outside judgment. Touramanis: Actually that's a good idea. To be honest there is a more general thing we might want all of us to keep in mind. We are a reasonably large group in KM3NeT, from a number of European countries, and we all have our own interests, and it is natural that everybody has the aspiration to have this experiment done nearby in his country, and everybody is proud of all the development work they have done. But at some point we may be able to decide without an external committee, or we may have to have one in place. But it seems to outside people, and I have heard it from a few sources, that we don't seem to have a sort of outside overseeing body. It's up to us to be the ones who have the discussions, who have to confront our ideas, and then we have to make our decisions and sort of judge them. Now on the issue of site. I don't know if anybody from the panel wants to say something more, or if we are ready to take some views from the audience. Petros Rapidis (NCSR ''Demokritos'', Athens, Greece): A question really to Professor Watson, because you said things that were close to my heart, but I would like you to expand a little bit more on that. I've been at Fermilab for quite some time and have seen the Auger activities as they went on.
The question for me is how crucial was the existence of a central laboratory that served as a clearing house and as an organisational basis. One of the things that I see here is that we are missing that, and effectively in that sense we have not coalesced into a single army marching to the same tune. Watson: That's a very interesting question. It's not so much the central institution I think, as the people involved. At Fermilab, we were, first of all, blessed by John Peoples, who allowed the design study to take place there for six months essentially at zero cost. This of course was not without the influence of Jim Cronin, just down the road in Chicago. Then the project management has been done absolutely superbly by Paul Mantsch – I cannot praise Paul highly enough – but I also think it is fair to say that the R&D work that was done on detector design at Fermilab – by Peter Mazur in particular – could not have been done anywhere else. I think it was more of a coincidence in a sense that those people were in the same place, rather than a matter of central organisation. From the point of view of funding, we have a common pool of money in the collaboration. It was agreed pretty readily that the people did not want to put it into a South American bank. Secondly the South Americans didn't want to put it in a US bank. We are a recognized project of CERN and CERN is our bank: they charge us for that privilege, but it is actually very useful. Things like that are probably some way downstream, unfortunately, for KM3NeT, but they should be thought about. Touramanis: Someone must have a comment to make. Philippe Gorodetsky (APC—University of Paris 7, France): Let me start with a comment on the physics case. I understand everything that has been said and I understand the case of looking for point sources. The thing which I am not sure about myself is whether we have to be limited to the kind of sources which are known.
Never forget that only 3–4% of the matter is known, so we have a lot of unknown. I am old enough to have seen



Gamma Ray Bursts being observed for the first time, and I believe that there are possibilities for other things, even though we do not know what they are. There is a possibility of hadronic sources with neutrinos, where photons do not come out. You just have to put in enough shielding, and only a very tiny amount of shielding is needed to stop the photons from getting out. I think to fix our minds only on what is known as an electromagnetic source is a weak point. Secondly, obviously the more likely place to look is into our Galaxy, just because of the solid angle effect, nothing deeper than that, and this is a very strong effect. So I really believe that we should be careful not to be fixed on the existing physics of photons; there may be other things. This is one of the motivations to go with neutrinos. For the site, and concerning the amount of money available: if you want to put 250 million euro into a cubic kilometre, you need a bump somewhere, outside of the usual budget, and somebody said that the budget will not come out like that. I think this will be the limitation, and it's likely to be solved by money which is not in the standard budget, otherwise it will not be done. This will fix a lot of things, including the site. Hervé Pero (Research Infrastructures, DG Research, European Commission, Brussels): On the issue of selecting a site I have different comments. The first point is that ESFRI will very soon have a report on site selection, and the KM3NeT and EMSO partnerships should have a look. Normally it will be available by December. Selecting a site will be done by politicians, unless you come with just one single site and that's the only possibility. If not, ministers will have their say and they will not make a decision if you don't get anything on operational cost. I've seen this morning that you have very clear views on construction cost—even though it ranges from 150 to 350 million.
But ministers will not take a decision if there is nothing on operational cost. For construction cost, what we know is that regional governments will be happy to pay, that's the experience we have. The economic return is very clear: 75% for the region just for the construction. So just for having the flag and having the visibility and so on, the region will pay. The main issue is operational cost. You need to be able to budget that and we need to be able to find the political solution to have ministers funding this operational cost. And for that, what I've seen also today is that you talk very much – almost 100% – about science, whereas ministers, decision-makers, will not decide on science. They will decide on other impacts, e.g. socio-economic impacts, impact on the training of students, impact on industry and so on—it's good that you have a parallel session with industry. The issue of what would be the impact on industry, what will be the impact on the education and training of people, what is the value, what is the asset for Europe, what is the human capital, all these will also be key for decisions by ministers. Touramanis: Thank you for that. We could have one more. Zhan-Arys Dzhilkibaev (INR, Moscow, Russia): I want to discuss these problems from another point of view. The detection principle of neutrino telescopes is the following (see Slide 1): the input is the neutrinos, and the neutrinos are of three types, i.e. not only muon neutrinos. Then we have the target, water or ice, and coming from the target there are muons from muon neutrinos, and new neutrinos and cascades from all three types of neutrinos. After the telescope we have measurements of the energy, the time and maybe the direction, and of the composition of the neutrinos. Results and output physics depend on the resolution with which these factors are determined.
In the muon case, they do not strongly depend on the target, and here are the achievable resolutions (σθ ≈ 0.3–1°, and σE ≈ 0.3·log E). For cascades, they depend on the target (water or ice) properties, and in this case it is very important that in ice we have a

Slide 1. Dzhilkibaev.

Slide 2. Dzhilkibaev.

Slide 3. Dzhilkibaev.

scattering length about one order of magnitude smaller than in water. This is crucial because, as you see in Slide 2, for the angular distribution at a large distance, at 200 m, in ice with scattering we have a flat distribution. In contrast, in water we have a very strongly anisotropic distribution and we may be able to reconstruct not only muons but also the cascades with high resolution (Slides 3 and 4).


Slide 4. Dzhilkibaev.

Slide 5. Dzhilkibaev.

Slide 6. Dzhilkibaev.

I show our estimates for the GVD Detector (Slide 5), which we optimised for the cascades taking into account the muon threshold of 10 TeV. We find that we can reconstruct the vertex of the cascade with a position resolution of 2–4 m; the cascade energy resolution is (0.1–0.15)·log E, and the mismatch angle resolution is 3–6°. You may compare this angle resolution with the one for


muons, which is somewhat less than one degree. So in the physics for which we construct and optimise our detectors we must include not only the muons but also the cascades. Cascades allow us to study the flavour composition of the neutrino flux. The flavour composition is expected to be equal in all flavours due to neutrino oscillations. The energy spectrum shown here for electron neutrinos is practically the energy of the cascade, for events containing muons it is the sum of the muon and cascade energy, and for tau neutrino events it is close to the energy of the cascade (Slide 6). The cascades allow us to investigate the global anisotropy of the extragalactic sources, and the local anisotropy, that is, galactic sources and the galactic plane. For point source studies the cascades are complementary to the muons. For transient sources, such as GRBs, in the case of cascades if we have good measurement resolution, we may be able to investigate the correlation of their time and position with the time and direction of the cascade. Therefore what I think is necessary is to have a detector with high resolution for these parameters and to deploy and use all possible channels of detection. The telescope has to use these features for the physics program. I think it is not so good to look only for point sources. The important thing is the resolution of the detector! Touramanis: We have spoken about site selection. So let us move to another topic. Rapidis: It's not a site selection argument; it's mostly a comment about the physics. The presentation by Uli sort of stopped at energies of about 100 TeV, even though statements were made that we are going to be more opportunistic and also look at higher energies. I would like to say that this is the first time we are opening a new eye to the sky – this is probably a comment for Mr. Pero to carry to the authorities – and every time we open a new eye to the sky, we have seen incredible things.
Along the same lines, we should open our eyes to the very large energies, and getting a detector for very large energies comes at a minimal cost, so to speak, because you don't have to have the high density of instrumentation that lower energies require. I think we should set our goals higher and we should definitely mention the PeV and even beyond-PeV energy range in everything we do. Perhaps Paolo Lipari would like to comment on that. Lipari: There is no doubt that you can look for high energies, and it is certainly one of the interesting lines of research. The general problem, I think, of all these discussions is that you have to set physics goals and you have to work on what is possible – this is the argument that Jean-Jacques Aubert developed – but you have to do it in the right way. The idea that you can have surprises is certainly relevant—you do all of this looking for surprises. But on the other hand, there are a lot of examples in the history of science where you looked where nobody had ever looked before and you didn't find anything. So I think to look for PeV emission is certainly something that you should do—you should decide on this. But actually you may want first to try to look for those sources that nearly certainly are there. I think in our galaxy there are sources visible at the level of 10 km² and several years. Hopefully in the meantime you will discover other more unexpected things, which would be much more interesting. I am pretty sure that those things which are unexpected will emerge at high energy; it's more likely in many ways. I think you have the window of say 10 TeV, which is the one where point sources are, and then – but then these are speculations – if I have to bet on surprises, the surprises would more likely be at significantly higher energies. Katz: Just one sentence, directly to that.
I was not very explicit on that, but of course when you optimise for this energy range, around 10–100 TeV, you will be automatically sensitive to higher energies. However if you optimise exclusively to higher energies, like beyond a PeV, then you will lose the point sources, the



galactic, the standard point sources. My feeling is that this would be a wrong decision. Spiering: I would also not optimise for PeV but more for 100 or 10 TeV, and have a threshold even a little bit lower. One remark on Zhan's (Dzhilkibaev) contribution. We totally forgot about the fact that 2/3 of the signal comes as cascades, and cascades have bad pointing, a little bit better in water but really bad in ice. Until now at least we don't have good algorithms to reconstruct directions sufficiently well, because they are limited: cascades are close to point-like light emission and are not long tracks with a good lever arm. But on the other hand, there is a lot of information in them, and for diffuse fluxes they certainly give a very valuable contribution. For cascades indeed there is a physics reason to build one detector at one site, because they go with volume and you want a well-shielded fiducial volume. So cascades would really be in favour of one site whereas, as far as I understood, there have been estimates that with muons you get 10%, or even 20%, better if you make it at two sites, for sufficiently high energies. But for cascades that is certainly not true. Touramanis: I think experience has shown in this field, and in many other fields, that the better you build your detector, the more clever ways and interesting physics you will find to do. It has been the case in colliders, everywhere. Emilio Migneco (LNS and University of Catania, Italy): I am trying to follow the discussion and to get some idea about the future activity, because – unfortunately or fortunately – I am in charge of coordinating this collaboration for the coming three years. Each morning I try to make up my mind and see what to do. I try not to get too depressed by this discussion. Each morning I say: ''Is it interesting to study the high-energy universe?'' and I answer ''Yes, it is very interesting, very exciting.
I do really want to spend the rest of my scientific life in this field.'' Also, and this is more important, the same is true for a lot of young collaborators who are spending most of their lives on this activity. It is interesting and highly scientifically important to study the high-energy universe. An interesting second question is: ''Shall we study it with neutrinos? Or is it enough to study it with high-energy gamma rays and high-energy cosmic rays?'' Again we have to answer these questions each day. There are questions that 10 or even 20 years of activity, with enormous detectors built in these two fields, and with enormous progress and understanding, have not solved. So we are hearing this each day, even at this conference. So the neutrino has a specific task and a specific question that it can answer; there is no doubt about this for me. I guess that this position is shared by the rest of the audience and I hope that it is also the general feeling of this round table. A third point: as it is important to start high-energy neutrino astrophysics, the question is where to build a detector. The answer without doubt is in the Mediterranean Sea. Where has a detector been built? At the South Pole! Strange, eh? But this is life. And we are fighting each day to convince people to build it in the Mediterranean Sea, where we can look at the right sources. Strange life, but this is the case. Now, how large must it be? The answer ''as large as possible'' is stupid. The right answer is to build it as large as is needed to have, obviously, a meaningful answer in terms of physics. In the talks and the papers I have read, one can find some mention of this. Personally I put it in terms of sensitivity at a level of well below 10⁻¹² TeV cm⁻² s⁻¹. A telescope which has a sensitivity between, let's say, 5×10⁻¹³ and 10⁻¹² TeV cm⁻² s⁻¹ is an affordable, meaningful, important contribution to finally start high-energy neutrino astronomy. That's my answer to this question.
What does it mean in terms of size? More or less what Uli has presented. There were two options. Why two options? It makes

sense because a first option can be at the level where one just scratches the problem, with a few neutrino events per source. The second option, a step to a twice or three times larger detector, of course could be the step where you are allowed to do more physics and to get more information in this field. Fourth point, the site: of course the single site is an obvious option. Somebody sets up this panel for a decision, or the council of ministers of Europe meets and decides the site. I don't mind. But the decision is not the whole story. The best part of the story is the budget. In other words, if somebody comes and tells me ''you have 200 million Euros and this is the site'', I will say OK. If there are two of these people saying the same, or three, then we can compete. But if it is only one and his proposal is meaningful, it is not a clear drawback. It's OK. Touramanis: Believe me, the UK will also join in: if you find the place where there's 200 million Euros on the table, the STFC of the UK will back this also. Migneco: But let me add: if it is the case that this wise person is not at our disposal for the moment, and that there is a group of less wise persons who propose a site with less budget, then I, as the coordinator of the preparatory phase, feel the duty to investigate this option deeply. If there is nobody who presents a strong argument against – not just of the type ''we like a single site because it is better'' but a stronger, clear, scientific-technical argument against – then the multi-site option for me is still on the table. That's the end of my contribution. Touramanis: Thank you for that. At this point we have a choice to make: either we touch a little bit on the other two points, on how to decide on the technical things other than the site, or, if we go on any more, then we'll need either more time or we'll have to skip those.
Given the schedule constraints maybe we should now go to the third point, which is the sort of optical modules, the technology and so on. Again maybe Uli (Katz) can remind us what the plan is, and that might be less controversial. Katz: It's what I presented previously. There are a few technical decisions to be made, but they need further prototyping and further studies in order to be on solid ground. I think we have experienced cases where it has been shown that the learning curve that leads finally to a sound technology can be rather painful and also takes some time. We should optimise, and optimising means not taking premature decisions. The optical modules and the mechanical structures will have to be prototyped, tested and further explored over the next 18–24 months. That is true – as Emilio said and pointed out before – also for a lot of those technical design options of which we are rather optimistic that they will work, but since they are new compared to what we use in existing experiments, they have to be developed further. They also have to be verified, they have to be tested, and they also have to be assessed in terms of risk and cost etc. So all this has to be done in the next 18–24 months and it should give us a sound basis for fixing and freezing the design. As I said before, freezing the configuration in terms of geometry may come a little later – not very much later – because simply operationally it has to be done before we start purchasing the components. Of course one dimension that has not been addressed in my talk, deliberately, and also only partly in this discussion, is that there is a second arrow in time, which is the availability of resources. The funding must be there, otherwise we cannot do much. Even prototyping certainly requires more funding than is obviously available to the consortium. So I think this is a second aspect which has to be looked at very carefully. I think Emilio has a huge task in coordinating that in the preparatory phase.
Alan Ball (CERN, Geneva, Switzerland): Maybe a silly question, I have not been very involved recently, but I just wonder when the idea of a star configuration was dropped, and why.


Leonidas Resvanis (NESTOR Institute and University of Athens, Greece): We did not put it on the table finally, not because we don't believe that it is a superior solution, but in the interest of convergence. Bettini: May I ask something: in the next two or three years you have to converge on the design, and that costs money, for example for prototyping. You have money, I understand, from FP7 – that is 5 million – and that is not enough for everything. Resvanis: Not for prototyping, prototyping was excluded. Bettini: Do you foresee a coordinated process to request money from the various funding agencies specifically for KM3NeT? Touramanis: I think Uli (Katz) is the best person to answer. Katz: First of all I should say that some money is available. There is substantial funding in the Netherlands for pursuing prototyping. I understand that there is money from INFN to pursue prototyping, so there are some resources, but we definitely foresee that on the basis of the TDR other countries will also go the next step, i.e. get money for this prototyping, research and development phase until the design is frozen. Bettini: I have some experience: I have looked into some proposals of a European country, I cannot mention which, for this type of activity. They were clearly not committed to something coordinated within KM3NeT. Touramanis: I think we could do with more and better coordination. I think the management of the collaboration so far has done a fantastic job. We are all Mediterranean, which means that all sites offer nice weather but also that we all have temperament, and we have to live with it. We don't have the time to go into technical details, e.g. of this photomultiplier versus that photomultiplier etc. I think we have some assurance that the process is there and the money is there. Maybe as the last item we may want to look a little bit at the configuration, which also has to do with physics.
The things mentioned before were about containment, energy measurement, etc. Who would like to make a comment on that? Katz: I think much has been said on configuration, starting from Christian's (Spiering) talk, to what I said today, about the connection between configuration and the ongoing process. Configuration studies per se are quite complicated and time consuming, because there is an infinite number of configurations that you might want to study in the end, within an infinite-dimensional parameter space. So scanning that is not a good idea. You need educated guesses to start with. I think there are some educated guesses that we are looking at, but I think a very sound result can only be achieved once the technical design has been frozen or is close to being frozen. Spiering: First I was really surprised and kind of happy to see that you converged at something like 180 m separation [between optical modules], which was exactly where we stopped. I would have expected that the optimum for an E⁻² source is even a little bit higher. Of course you can't go much higher if you don't collect enough light. In that context my question is: for which of the many photomultiplier options was this optimised? Katz: As I said in my presentation, what I showed was for the flexible tower and it was for classical photomultipliers. Please be aware that per detection unit the photocathode area for the towers is double that of the strings, because they have 6 times as many optical modules. It all boils down to this: the leading parameter is the overall photocathode area. If you sum it up for a different configuration it comes out at a similar magnitude. I should also say, the simulations for the multi-photomultiplier optical modules are maybe lagging behind a little bit, because it turned out to be quite tricky and difficult to implement them in the existing algorithms, and there is certainly work to be done. Touramanis: Right, I think Paolo (Lipari) was in line.


Lipari: I want to make a comment which is a little bit outside this stream. I wanted to somehow address the passionate comments of Emilio (Migneco) before. The point I want to raise is a question about interdisciplinary studies. When you make these instruments, which are exploratory, this is a possibility to make these interdisciplinary studies happen. Actually I'm sorry that in this meeting I ended up not hearing anything about these studies. The results of the observations near Catania received a lot of attention in the press and this was very nice. If you have three points where you make measurements, maybe you can learn something important. This issue should not just be used for image but for substance. I really have a very high opinion of the work that people in Catania are doing, and I understand that now you have these three sites and you are doing prototyping, which will take some time. As I was saying, it is possible that you could develop a lot of interdisciplinary work. There multi-site is really a big advantage. It is clear that now you have to exploit these three sites for prototyping. I don't know which Chinese sage said it, Laozi maybe: ''the perfect path cannot be walked, you have to walk where it is possible.'' I feel guilty as a theorist to come here and say one site is better. You just do it yourselves. These two years of prototyping can really exploit these three sites and then maybe you can converge in the light of these things. The interdisciplinary studies could benefit from this transition phase. Touramanis: Uli (Katz) has a comment. Katz: Just a short comment. I agree with that, and marine science groups were deliberately included in the design study from the very beginning. I think these aspects were taken very seriously. The only thing that you have to keep in mind is that what we are designing is a research infrastructure that provides a node for these communities. We do not design the marine and earth science instruments.
That is a different community and a different task. Therefore we usually do not detail so much the measurements that will be done. But you are right, they are of course highly interesting and they should also be related to the prototyping activities. Touramanis: I certainly liked the last comments, as most of the things we have heard. One comment on your previous reply; I would like to clarify the issue of the first question. The question was not whether we really care to do it, because clearly anybody who does not seriously care to do it would not be here. That was not the question; the question was: given that we want to do it and we have already convinced ourselves, how do we make sure that we can convey that enthusiasm and persuade other people? That could be the people in the street, as we tried to do last night with the public talk and the press coverage. It is very important to actually convince the people who make the decisions. Alexander Kappes (University of Erlangen-Nuremberg, Germany): I have a question to Christian about what he just said. You said that you were very happy when you saw that this optimisation came out at 180 m, but this is an optimisation for the E⁻² spectrum. So you are optimising really for the high-end part whereas, as you said before – and we all agree – the galactic sources are important, which are at around 10 TeV. So if you have to go to 180 m and you optimise for E⁻², you lose significantly at the galactic sources. Somehow there seems to be a contradiction. Spiering: For some of the galactic sources! Remember when you investigated the Milagro sources, and there you came to the conclusion that a cut at 30 TeV is the best one. Supernova remnants are different, and of course everything has to be taken with a grain of salt. I did not say that you should cut at 30 TeV for every source, but going lower in your sensitivity, below one TeV, certainly makes no sense, not even for SNR 1713.



Kappes: One TeV is clear but 10 TeV is not. Spiering: In that context I said 1 to 3. I can show you on my slides. Kappes: OK. Touramanis: I guess at this point we can close the session. I would like to thank, first of all, the organisers for trusting me to do this, but I would also like to thank very much the panel members and everybody for a useful discussion on a number of things that are in our heads. We can get ideas and material to discuss over lunch, which is very important. Thank you very much!

Appendix A. Slides of Christian Spiering

See Figs. A1–A5.

Appendix B. Can KM3NeT learn from the Pierre Auger Observatory experience?

Fig. A1. Spiering—Slide 1.

A. A. Watson, School of Physics and Astronomy, University of Leeds, Leeds LS2 9JT, UK

B.1. Introduction

During the ‘Round Table Discussion’ at the recent International Workshop on Very Large Volume Neutrino Telescopes, I was asked to make some comments reflecting on whether there was anything that the KM3NeT Collaboration might learn from the experiences encountered by those of us who developed the Pierre Auger Collaboration and Observatory. I am a tremendous admirer of the extraordinary vision, imagination, skill and dedication shown by those driving KM3NeT, and the comments below are made in the belief that they may help the goals of this very ambitious project to be achieved.

B.2. The Pierre Auger Observatory

The southern branch of the Pierre Auger Observatory was completed in June 2008; funding bids are currently being considered for a northern site. The collaboration consists of about 400 physicists from around 100 institutions from 18 countries. The project was conceived by Jim Cronin and myself in 1991 and had developed to the stage of making decisions on sites in late 1995. The original plan had been to build the Northern and Southern Observatories simultaneously, each covering 3000 km², but a decision in 1998 by a joint NSF/DOE committee, SAGENAP, gave US money for only one site, with the directive that it was to be built in Argentina. Following a ground-breaking ceremony in March 1999, construction of an Engineering Array took place during 2000/2001 with the aim of demonstrating the functionality of the subsystems. This was a requirement of the US funding agencies. An international panel reported favourably on progress with this prototype late in 2001 and a green light was given to start construction. Data-taking began in January 2004 and has continued with essentially 100% on-time for the surface detectors as more and more were added.
The Observatory contains 1600 × 12-tonne water-Cherenkov detectors laid out over 3000 km², overlooked by 24 fluorescence telescopes, in groups of 6, located on 4 hills at the boundary of the surface array. There are about 15,000 photomultipliers in the instruments. The total capital cost was $50M and the project was completed within budget, though less rapidly than had been

Prof. Watson submitted this written summary of his remarks (Ed.).

Fig. A2. Spiering—Slide 2.

Fig. A3. Spiering—Slide 3.

envisaged. The slow rate of completion had many causes, a major one being cash flow, which made it difficult to order large quantities of the more expensive components at a given time. There were also delays caused by difficulties of land access and the devaluation of the Argentinian peso in early 2002: these became coupled when non-Argentinians bought up large tracts of land with US dollars.

B.3. Size of the observatory

The original plan was that the surface arrays in each hemisphere would cover 3000 km². This was ∼30 times greater than the AGASA area and ∼2 times larger than HiRes at their highest energies, but with 100% on-time. At the time of discussions about the size, it was widely believed that a substantial number of events had been recorded with energies above 10²⁰ eV; in particular, events of 2 × 10²⁰ eV and 3 × 10²⁰ eV had been reported from AGASA and Fly's Eye, respectively. A constant question that Cronin and I faced when we were trying to form the collaboration was 'Why do you want to make it so large?' This was not an easy question to answer, even with the indications from previous work (the Fly's Eye event of 3 × 10²⁰ eV was reported in 1993). We pressed successfully, however, for 3000 km², but what we have discovered is that this is much too small. It seems that earlier experimental estimates of the shower energies derived from surface detectors such as AGASA were too high. The most interesting events are the most energetic, and above ∼55 EeV the rate is only about 2 per month. Thus the prospects of ever getting a spectrum from a single source are remote. It is not feasible to increase the southern area by more than a factor of ∼2 because of the availability of land near the present site, and a survey of the northern sky remains highly desirable. Accordingly the plan for Auger North is to build an array of ∼21,000 km² in South Eastern Colorado. The scale is set by the availability of land and what seems to be the maximum sum that we might expect to get (∼100 M€). The message from Auger for KM3NeT then must surely be: BUILD IT BIG. Unless there are clear indications from IceCube of signals, then even a factor of 3 larger than IceCube may not be enough. Theoretical predictions are often a poor guide. The prediction of the Greisen–Zatsepin–Kuz'min effect was a reliable prediction about spectral shape but not about the flux. The track records of phenomenologists tend to show that they are often optimists about flux estimates of anything.

Fig. A4. Spiering—Slide 4.

Fig. A5. Spiering—Slide 5.

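The area argument in Section B.3 is simple proportional scaling: at fixed on-time and efficiency, the event rate above a given energy grows linearly with array area. A minimal sketch, using only the numbers quoted in the text (∼2 events/month above ∼55 EeV over 3000 km² in the south, and the proposed ∼21,000 km² for Auger North):

```python
# Illustrative rate scaling with array area, using the figures quoted
# in Section B.3. Assumes rate is strictly proportional to area at
# fixed on-time and detection efficiency (a simplification).
SOUTH_AREA_KM2 = 3000.0
SOUTH_RATE_PER_MONTH = 2.0  # events above ~55 EeV

def scaled_rate(area_km2: float) -> float:
    """Expected monthly rate above ~55 EeV for an array of the given area."""
    return SOUTH_RATE_PER_MONTH * area_km2 / SOUTH_AREA_KM2

print(scaled_rate(21000.0))  # → 14.0 events/month for Auger North
```

Under this naive scaling, the proposed northern array would collect about seven times the southern rate, which is the motivation behind "BUILD IT BIG".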
B.4. Number of sites

A survey of possible sites for the Auger Observatory was carried out worldwide in both hemispheres, with criteria including latitude, altitude, area, flatness, ease of deployment, optical quality, hills for fluorescence detectors, national funding prospects and local infrastructure. For the Southern Hemisphere, suitable sites in Argentina (2), Australia and South Africa were identified. In November 1995 about 80 scientists from 19 countries met in the UNESCO headquarters in Paris and, after presentations from representatives of the three countries with prospective sites, a vote was taken with each of the 19 countries represented having one vote. There was a very clear majority in favour of Argentina. The size of the majority was helpful and there was no bitterness. The South Africans did not remain in the project, but this was more because of the proximity of H.E.S.S. and the remoteness of Argentina than because of the site decision. Of course at this stage there was no money available, and several other countries present at this site-selection stage did not continue in the project, although some did return after construction had started and others who were not involved in the early stages joined later.

The idea of having the project spread over two or three sites was never discussed; it never crossed anyone's mind. I would strongly advise the KM3NeT collaboration against a multi-site approach. There are many reasons why a multi-site solution does not seem to me to be a good idea. Here are a few:

- Work at each site would surely proceed at different rates.
- There would be unhealthy competition.
- It might be hard to keep all designs the same.
- Three infrastructures would be required.
- Would there really be three allocations of regional money?
- There will be a loss of economies of scale.
- Edge effects will be a problem and surely degrade energy resolution.
- Energy measurements are tough anyway and would surely be worse with smaller volumes.
- Cross-calibrations will be needed (e.g. three pointing arrays would be required).
- Project management might require three project managers.

B.5. Capital cost estimates

The cost estimates for the Auger Observatory were made using the Work Breakdown Structure (WBS) commonly favoured for large


projects in the US. While this does not map well on to European practices, largely because of the way in which staff are costed, it did work; ways around the European problem were found. A great deal of attention was given to the contingency included in the budget. A contingency was calculated for each item (the WBS went down to around 7 levels for each major piece of equipment), assigned according to whether there was a vendor's quote, an engineer's estimate or a physicist's guess (+50%). This was tiresome to do, but it did lead to the project coming in on budget and, apart from readily understood difficulties arising from the ×4 devaluation of the Argentinian peso, which led to the Argentinian contribution being reduced by around $8M, there was no need to go back to agencies asking for more money because of over-spends. This is very important.

B.6. Running costs

It is difficult to make an accurate estimate of running costs for a unique device. We were guided by data from the shower project

in the UK (Haverah Park), which had operated for over 20 years, and also by the canonical range of 5–15% that is found for accelerators, telescopes, etc. Auger running costs are towards the low end of this range. I suspect that for KM3NeT the annual costs will be close to the top end, certainly in the early years. As an aside, it is interesting to note that when Herschel was seeking £1385 from George III in the late 18th century, he also asked for operating costs for the first year after construction, receiving a further £150. Clearly 5–15% is a good ball-park figure.
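The per-item contingency scheme of Section B.5 amounts to a simple weighted roll-up over the WBS. A minimal sketch, in which the item names and the first two contingency rates are illustrative assumptions (only the +50% physicist's-guess rate is quoted in the text), with a sanity check of the 5–15% running-cost rule against the Herschel anecdote:

```python
# Sketch of a WBS contingency roll-up: each item carries a contingency
# rate set by the quality of its cost basis. The vendor-quote and
# engineer-estimate rates below are assumed for illustration; only the
# physicist's guess (+50%) is quoted in the text.
CONTINGENCY = {
    "vendor_quote": 0.10,       # assumed
    "engineer_estimate": 0.25,  # assumed
    "physicist_guess": 0.50,    # +50%, as quoted
}

def total_with_contingency(items):
    """items: iterable of (cost, basis) pairs; returns budget incl. contingency."""
    return sum(cost * (1.0 + CONTINGENCY[basis]) for cost, basis in items)

wbs = [(1_000_000, "vendor_quote"),
       (500_000, "engineer_estimate"),
       (200_000, "physicist_guess")]
print(total_with_contingency(wbs))  # → 2025000.0

# Herschel anecdote as a check on the 5-15% running-cost band:
# 150 / 1385 is about 10.8%, comfortably inside the range.
ratio = 150 / 1385
assert 0.05 <= ratio <= 0.15
```

The value of the exercise is less the arithmetic than the discipline: every item's estimate is tagged with its provenance, so the aggregate contingency is traceable rather than a single guessed percentage.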

B.7. Prototyping

The Auger Collaboration learned a great deal about operating costs and deployment from the prototyping that was done for the Engineering Array. The complexity of the possible string designs for KM3NeT would seem to me to make it imperative for test deployments of the possible designs to be made at the selected site before a final decision is taken on design.