
Advanced Concepts Team - Group items tagged "source"


tvinko

Massively collaborative mathematics : Article : Nature - 28 views

  •  
    peer-to-peer theorem-proving
  • ...14 more comments...
  •  
    Or: mathematicians catch up with open-source software developers :)
  •  
    "Similar open-source techniques could be applied in fields such as [...] computer science, where the raw materials are informational and can be freely shared online." ... or we could reach the point, unthinkable only few years ago, of being able to exchange text messages in almost real time! OMG, think of the possibilities! Seriously, does the author even browse the internet?
  •  
    I do not agree with you, F., you are quoting out of context! Sharing messages does not make a collaboration, nor does a forum... You need a set of rules and a common objective. This is clearly observable in "some team", where these rules are lacking, making team work nonexistent. The additional difficulties here are that it involves people who are almost strangers to each other, and the immateriality of the project. The support they are using (web, wiki) is only secondary. What they achieved is remarkable, regardless of the subject!
  •  
    I think we will just have to agree to disagree then :) Open source developers have been organizing themselves with email since the early '90s, and most projects (e.g., the Linux kernel) still do not use anything else today. The Linux kernel mailing list gets around 400 messages per day, and they are scaling just fine as the number of contributors increases. I agree that what they achieved is remarkable, but it is more for "what" they achieved than "how". What they did does not remotely qualify as "massively" collaborative: again, many open source projects are managed collaboratively by thousands of people, and many of them are in the multi-million-lines-of-code range. My personal opinion on why these open models are having so many difficulties in the scientific world is that the scientific community today is (globally, of course there are many exceptions) a closed, mostly conservative circle of people who are scared of change. There is also the fact that the barrier to entry in a scientific community is very high, but I think that this should merely scale down the number of people involved and not change the community "qualitatively". I do not think that many research activities are so much more difficult than, e.g., writing an O(1) scheduler for an operating system or writing a new balanced-tree algorithm for efficiently storing files on a filesystem. Then there is the whole issue of scientific publishing, which, in its current form, is nothing more than a racket. No wonder traditional journals are scared to death by these open-science movements.
  •  
    here we go ... nice controversy! but maybe too many things mixed up together - open science journals vs traditional journals, conservatism of the science community wrt programmers (to me one of the reasons for this might be the average age of the two groups, which is probably more than 10 years apart ...) and then email vs other collaboration tools ... will have to look at the paper more carefully now ... (I am surprised to see no comment from José or Marek here :-)
  •  
    My point about your initial comment is that it is simplistic to infer that emails imply collaborative work. You actually use the word "organize" - what does it mean, indeed? In the case of Linux, what makes the project work is the rules they set and the management style (hierarchy, meritocracy, review). Mailing is just a coordination means. In collaborations and team work, it is about rules, not only about the technology you use to potentially collaborate. Otherwise, all projects would be successful, and we would not learn management at school! They did not write that they managed the collaboration exclusively thanks to wikis and emails (or other 2.0 technology)! You are missing the part that makes it successful and remarkable as a project. On his blog the guy put a list of 12 rules for this project. None are related to emails, wikis, forums ... because that would be lame, and then your comment would make sense. Following your argumentation, the tools would be sufficient for collaboration. In the ACT, we have plenty of tools, but no team work. QED
  •  
    the question of ACT team work is one that comes back continuously, and so far it has always boiled down to how much there needs to be, and should be, a team project to which everybody in the team contributes in his/her way, versus how much we should let smaller, flexible teams within the team form and progress, following a bottom-up initiative rather than imposing one top-down. At this very moment, there are at least 4 to 5 teams with their own tools and mechanisms which are active and operating within the team. - but hey, if there is a real will for one larger project of the team to which all or most members want to contribute, let's go for it .... but in my view, it should be on a convince rather than oblige basis ...
  •  
    It is, though, indicative that some of the team members do not see all the collaboration and team work happening around them. We always leave the small and agile sub-teams to form and organize themselves spontaneously, but clearly this method leaves out some people (be it because of their own personal attitude or by pure chance). For those cases we could think of providing the possibility to participate in an alternative, more structured team effort where we actually manage the hierarchy and meritocracy and perform the project review (to use Joris' words).
  •  
    I am, and was, involved in "collaboration", but I can say from experience that we are mostly a sum of individuals. In the end, it is always one or two individuals doing the job, and the others waiting. Sometimes, even, some people don't do what they are supposed to do, so nothing happens ... this could not be defined as team work. Don't get me wrong, this is the dynamic of the team and I am OK with it ... in the end it is less work for me :) team = 3 members or more. I am personally not looking for a 15-member team effort, and it is not what I meant. Anyway, this is not exactly the subject of the paper.
  •  
    My opinion about this is that a research team, like the ACT, is a group of _people_ and not only brains. What I mean is that people have feelings - hate, anger, envy, sympathy, love, etc. - about the others. Unfortunately(?), this can lead to situations where, in theory, a group of brains could work together, but not the same group of people. As far as I am concerned, this happened many times during my ACT period. And it is happening now with me in Delft, where I have the chance to be in an even more international group than the ACT. I collaborate efficiently with those people who are "close" to me not only in scientific interest, but also in some private sense. And I have people around me who have interesting topics and might need my help and knowledge, but somehow, it just does not work. Simply a lack of sympathy. You know what I mean, don't you? About the article: there is nothing new, indeed. However, here is why it worked: only the brains, and not the people, worked together on a very specific problem. Plus maybe they were motivated by the idea of e-collaboration. No revolution.
  •  
    Joris, maybe I did not make myself clear enough, but my point was only tangentially related to the tools. Indeed, it was the original article's mention of the "development of new online tools" which prompted my reply about emails. Let me try to say it more clearly: my point is that what they accomplished is nothing new methodologically (i.e., online collaboration of a loosely knit group of people); it is something that has been done countless times before. Do you think that the fact that it is now mathematicians doing it makes it somehow special or different? Personally, I don't. You should come over to some mailing lists of mathematical open-source software (e.g., SAGE, Pari, ...), there's plenty of online collaborative research going on there :) I also disagree that, as you say, "in the case of Linux, what makes the project work is the rules they set and the management style (hierarchy, meritocracy, review)". First of all, I think the main engine of any collaboration like this is the objective, i.e., wanting to get something done. Rules emerge from self-organization later on, and they may be completely different from project to project, ranging from almost anarchy to BDFL (benevolent dictator for life) style. Given this kind of variety that can be observed in open-source projects today, I am very skeptical that any kind of management rule can be said to be universal (and I am pretty sure that the overwhelming majority of project organizers never went to any "management school"). Then there is the social aspect that Tamas mentions above. From my personal experience, communities that put technical merit above everything else tend to remain very small and generally become irrelevant. The ability to work and collaborate with others is the main asset that a participant in a community can bring. I've seen many times on the Linux kernel mailing list contributions deemed "technically superior" being disregarded and not considered for inclusion in the kernel because it was clear that
  •  
    hey, just caught up on the discussion. For me what is very new is mainly the framework in which this collaborative (open) work is applied. I haven't seen this kind of open working in any other field of academic research (except for BOINC-type projects, which are very different because they rely on non-specialists for the work to be done). This raises several problems, mainly that of credit, which has not really been solved as far as I read in the wiki (if an article is written, who writes it, whose names go on the paper?). They chose to credit the project, and not the individual researchers, as a temporary solution... It is not so surprising to me that this type of work was first done in the domain of mathematics. Perhaps I have an idealized view of this community, but it seems that the result obtained is more important than who obtained it... In many areas of research this is not the case, and one reason is how the research is financed. To obtain money you need to have (scientific) credit, and to have credit you need to have papers with your name on them... so this model of research does not fit, in my opinion, with the way research is governed. Anyway, we had a discussion on the Ariadnet about how to use it, and one idea was to do this kind of collaborative research; an idea that was quickly abandoned...
  •  
    I don't really see much of a problem with giving credit. It is not the first time a group of researchers has collectively taken credit for a result under a group umbrella, e.g., see Nicolas Bourbaki: http://en.wikipedia.org/wiki/Bourbaki Again, if the research process is completely transparent and publicly accessible there's no way to fake contributions or to give undue credit, and one could cite a group paper without problems in his/her CV, research grant application, etc.
  •  
    Well, my point was more that it could be a problem with how the actual system works. Let's say you want a grant or a position: the jury will count the number of papers with you as first author, and the other papers (at least in France)... and look at the impact factor of these journals. Then you would have to set up a rule for classifying the authors (endless and pointless discussions), and give an impact factor to the group...?
  •  
    it seems that I should visit you guys at ESTEC... :-)
  •  
    urgently!! btw: we will have the ACT Christmas dinner on the 9th in the evening ... are you coming?
Francesco Biscani

Tom Sawyer, whitewashing fences, and building communities online - 3 views

  • If you are looking to ideas like open source or social media as simple means to get what you want for your company, it’s time to rethink your community strategy.
  • I’ve talked to people at companies who are considering “open sourcing” their product because they think there is an army of people out there who will jump at the chance to build their products for them. Many of these people go on to learn tough but valuable lessons in building community. It’s not that simple.
  •  
    Illuminating article about corporations trying to exploit "open source" and not getting what they want.
  •  
    I like the Red Hat definition: "To be the catalyst in communities of customers, contributors, and partners creating better technology the open source way."
  •  
    yeah, it is the same with crowdsourcing in general, when some company "managers" see how much more cheaply they could get it done but don't understand where it comes from...
LeopoldS

Open Source - Corporate - Aldebaran Robotics | Key Features - 3 views

  •  
    anybody of you already playing around with this open source toy?
  • ...1 more comment...
  •  
    nobody?
  •  
    My son Patrick did his training period at MIT working on this NAO. Type "NAO Bechon" and you will get some results, including a nice video...
  •  
    Roughly half of the researchers in robotics I know work with NAOs, including those in my lab...
jcunha

Superfast light source made from artificial atom - 0 views

  •  
    A new, more efficient type of single-photon light source consisting of a quantum dot reproduces a 1954 theoretical proposal by Robert Dicke. Applications in quantum communications are the direct target. "All light sources work by absorbing energy - for example, from an electric current - and emit energy as light. But the energy can also be lost as heat and it is therefore important that the light sources emit the light as quickly as possible, before the energy is lost as heat."
LeopoldS

Global Innovation Commons - 4 views

  •  
    nice initiative!
  • ...6 more comments...
  •  
    Any viral licence is a bad license...
  •  
    I'm pretty confident I'm about to open a can of worms, but mind explaining why? :)
  •  
    I am less worried about the can of worms ... actually eager to open it ... so why????
  •  
    Well, the topic of GPL vs other open-source licenses (e.g., BSD, MIT, etc.) is as old as the internet and it has provided material for long and glorious flame wars. The executive summary is that the GPL license (the one used by Linux) is a license which imposes some restrictions on the way you are allowed to (re)use the code. Specifically, if you re-use or modify GPL code and re-distribute it, you are required to make it available again under the GPL license. It is called "viral" because once you use a bit of GPL code, you are required to make the whole application GPL - so in this sense GPL code replicates like a virus. On the other side of the spectrum, there are the so-called BSD-like licenses which have more relaxed requirements. Usually, the only obligation they impose is to acknowledge somewhere (e.g., in a README file) that you have used some BSD code and who wrote it (this is called an "attribution clause"), but they do not require you to re-distribute the whole application under the same license. GPL critics usually claim that the license is not really "free" because it does not allow you to do whatever you want with the code without restrictions. GPL proponents claim that the requirements imposed by the GPL are necessary to safeguard the freedom of the code, in order to avoid people being able to re-use GPL code without giving anything back to the community (which the BSD license allows: early versions of Microsoft Windows, for instance, had the networking code basically copy-pasted from BSD-licensed versions of Unix). In my opinion (and this point is often brought up in the debates) the pro/anti GPL division somehow mirrors the division between anti/pro anarchism. Anarchists claim that the only way to be really free is the absence of laws, while non-anarchists maintain that the only practical way to be free is to have laws (which by definition limit certain freedoms). So you can see how the topic can quickly become inflammatory :) GPL at the current time is used by aro
  •  
    whoa, the comment got cut off. Anyway, I was just saying that at the present time the GPL license is used by around 65% of open source projects, including the Linux kernel, KDE, Samba, GCC, all the GNU utils, etc. The topic is much deeper than this brief summary, so if you are interested in it, Leopold, we can discuss it at length in another place.
  •  
    Thanks for the record-long comment - am sure that this is the longest ever made on an ACT Diigo post! On the topic, I would rather lean toward the GPL license (which I also advocated for the Marek viewer programme we put on SourceForge btw), mainly because I don't trust that open source by nature delivers a better product and thus will prevail, but I still would like it to succeed, which I am not sure it would if there were mainly BSD-like licenses around. ... but clearly, this is an outsider talking :-)
  •  
    btw: did not know the anarchist penchant of Marek :-)
  •  
    Well, not going into the discussion about GPL/BSD, the viral license in this particular case in my view simply undermines the "clean and clear" motivations of the initiative's authors - why should *they* be credited for the use of something they have no rights to? And I don't like viral licences because they prevent all those people who want to release their stuff under a different licence from using things released under this licence, thus limiting the usefulness of the stuff released under that licence :) BSD is not a perfect license either, it also has major flaws. And I'm not an anarchist, lol
Dario Izzo

File Compression: New Tool for Life Detection? - 4 views

  •  
    As mentioned today during coffee .... we could think of linking this to source localization
  • ...3 more comments...
  •  
    Not sure what you mean by source localisation, but using gzip to discern "biological" from "non-biological" images seems to me *very* tricky... I mean, there are a lot of other factors that may affect the compressibility of an image besides the mere "regularity" of the pattern, and if they haven't controlled for these, this is just bullsh1t... (For instance, did they use the same imaging device to take those images? What about lighting conditions and exposure? etc.) The apostle of sometimes surprising uses of compression is Prof. Schmidhuber from IDSIA...
  •  
    I completely agree with you ..... still, if you have one instrument on board the spacecraft and your picture compressibility is a noisy indicator of some interesting source .... we could try to perform some probabilistic reasoning (a rough sketch of such a compressibility score follows this thread)
  •  
    I think they (IDSIA-Schmidhuber) are planning on putting something about that also inside the Acta Futura paper...
  •  
    Really, you think they'd target such a low impact factor publication? ;-P
  •  
    you will all soon be begging to publish in Acta Futura! We will be bigger than Nature.
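A minimal sketch of the compressibility score discussed in this thread (not the authors' actual pipeline): it simply uses the gzip-compressed size of an image's raw pixel data as a crude "regularity" indicator. The file names, and the use of Pillow and NumPy, are illustrative assumptions only.

    import gzip

    import numpy as np
    from PIL import Image

    def compressibility(path: str) -> float:
        # Compressed/uncompressed size ratio of the raw 8-bit grayscale pixels;
        # lower values mean a more regular (more compressible) image.
        pixels = np.asarray(Image.open(path).convert("L"), dtype=np.uint8)
        raw = pixels.tobytes()
        return len(gzip.compress(raw)) / len(raw)

    if __name__ == "__main__":
        # Caveat raised in the thread: the ratio also depends on the imaging
        # device, lighting, exposure and noise, so at best it is a noisy
        # indicator to feed into probabilistic reasoning, not a binary
        # biological/non-biological test.
        for name in ("sample_biological.png", "sample_abiotic.png"):  # hypothetical files
            print(name, round(compressibility(name), 3))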
Luís F. Simões

Open Source Hardware Hits 1.0 - Slashdot - 1 views

  • Open Source Hardware is a term for tangible artifacts — machines, devices, or other physical things — whose design has been released to the public in such a way that anyone can make, modify, distribute, and use those things.
Francesco Biscani

Apollo 11 Source Code on GoogleCode | Lambda the Ultimate - 0 views

  • TC   BANKCALL    # TEMPORARY, I HOPE HOPE HOPE
    CADR STOPRATE     # TEMPORARY, I HOPE HOPE HOPE
  •  
    Excellent comments from the original Apollo source code :)
Francesco Biscani

Google Code Blog: Apollo 11 mission's 40th Anniversary: One large step for open source ... - 0 views

  •  
    See, open source works :)
ESA ACT

SAGE: Open Source Mathematics Software - 0 views

  •  
    Creating a viable free open source alternative to
Athanasia Nikolaou

Nature Paper: Rivers and streams release more CO2 than previously believed - 6 views

  •  
    Another underestimated source of CO2 is turbulent waters. "The stronger the turbulences at the water's surface, the more CO2 is released into the atmosphere. The combination of maps and data revealed that, while the CO2 emissions from lakes and reservoirs are lower than assumed, those from rivers and streams are three times as high as previously believed." Altogether the emitted CO2 equates to roughly one-fifth of the emissions caused by humans. Yet more stuff to model...
  • ...10 more comments...
  •  
    This could also be a mechanism to counter human CO2 emissions ... the more we emit, the less turbulent the rivers and streams, the less CO2 is emitted there ... makes sense?
  •  
    I guess there is a natural equilibrium there. Once the climate warms up enough for all rivers and streams to evaporate they will not contribute CO2 anymore - which stops their contribution to global warming. So the problem is also the solution (as always).
  •  
    "The source of inland water CO2 is still not known with certainty and new studies are needed to research the mechanisms controlling CO2 evasion globally." It is another source of CO2 this one, and the turbulence in the rivers is independent of our emissions in CO2 and just facilitates the process of releasing CO2 waters. Dario, if I understood correct you have in mind a finite quantity of CO2 that the atmosphere can accomodate, and to my knowledge this does not happen, so I cannot find a relevant feedback there. Johannes, H2O is a powerful greenhouse gas :-)
  •  
    Nasia, I think you did not get my point (a joke, really, that Johannes continued) .... by emitting more CO2 we warm up the planet, thus drying up rivers and lakes, which will in turn emit less CO2 :) No finite quantity of CO2 in the atmosphere is needed to close this loop ... ... as for the H2O, it could just go into non-turbulent waters rather than staying in the atmosphere ...
  •  
    Really awkward joke explanation: I got Johannes' joke, but maybe you did not get mine: by warming up the planet to get rid of the rivers and their problems, the water of the rivers will be accommodated in the atmosphere - hence the greenhouse gas that is water.
  •  
    from my previous post: "... as for the H2O, it could just go into non-turbulent waters rather than staying in the atmosphere ..."
  •  
    I guess the emphasis is on "could"... ;-) Also, everybody knows that rain is cold - so more water in the atmosphere makes the climate colder.
  •  
    do you have the nature paper too? looks like very nice, meticulous, typically German research lasting over 10 years, with painstaking work by many researchers from all over the world involved .... and while important, the total is still only 20% of human emissions ... so a variation in it does not seem to change the overall picture
  •  
    here is the nature paper: http://www.nature.com/nature/journal/v503/n7476/full/nature12760.html I appreciate Johannes' and Dario's jokes, since climate is the common ground on which all of us can have an opinion, drawing authority from experiencing the weather. But, just as if I were trying to make jokes about materials science or A.I., I take a high risk of failing(!) :-S Water is a greenhouse gas; rain rather releases latent heat to the environment in order to be formed. Johannes, nice trolling effort ;-) Between this and the next jokes to come, I would suggest stopping to take a look here, provided you have 10 minutes, at how/where rain forms: http://www.scribd.com/doc/58033704/Tephigrams-for-Dummies
  •  
    omg
  •  
    Nasia, I thought about your statement carefully - and I cannot agree with you. Water is not a greenhouse gas. It is instead a liquid. Also, I can't believe you keep feeding the troll! :-P But on a more topical note: I think it is an over-simplification to call water a greenhouse gas - water is one of the most important mechanisms in the way Earth handles heat input from the sun. The latent heat that you mention actually cools Earth: solar energy that would otherwise heat Earth's surface is ABSORBED as latent heat by water, which consequently evaporates - the same water condenses into rain drops at high altitudes and releases this stored heat (a back-of-the-envelope number on the size of this transport is sketched after this thread). In effect the water cycle is a mechanism of heat transport from low altitude to high altitude, where the chance of infrared radiation escaping into space is much higher due to the much thinner layer of atmosphere above (including the smaller abundance of greenhouse gases). Also, as I know you are well aware, the cloud cover that results from water condensation in the troposphere dramatically increases albedo, which has a cooling effect on climate. Furthermore, the heat capacity of wet air ("humid heat") is much larger than that of dry air - so any advective heat transfer due to air currents is more efficient in wet air, transporting heat from warm areas to a natural heat sink, e.g. polar regions. Of course there are also climate-heating effects of water, like the absorption of IR radiation. But I stand by my statement (as defended in the above) that rain cools the atmosphere. Oh, and also some nice reading material on the complexities related to climate feedback due to sea surface temperature: http://journals.ametsoc.org/doi/abs/10.1175/1520-0442(1993)006%3C2049%3ALSEOTR%3E2.0.CO%3B2
  •  
    I enjoy trolling conversations when there is a gain for both sides at the end :-) . I had to check up on some of the facts in order to explain myself properly. The IPCC report lists the greenhouse gases here, and water vapour is included: http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faq-2-1.html Honestly, I read only the abstract of the article you posted, which is a very interesting hypothesis on the mechanism regulating sea surface temperature, but it is very localized to the tropics (vigorous convection, storms), a region in which I have very little expertise and which is difficult to study because of its non-hydrostatic dynamics. The only thing I can comment on there is that the authors assume constant relative humidity for the bottom layer, supplied by the oceanic surface, which limits the application of the concept to other regions of the Earth. Also, we may be confusing, in this conversation, a greenhouse gas itself with the radiative forcing of each greenhouse gas: I see your point about the latent heat trapped in the water vapour, and I agree, but the effect of the water is that it traps, even as latent heat, an amount of IR that would otherwise escape back to space. That is the greenhouse gas identity. And here is an image showing the absorption bands in the atmosphere and how important water is, without vain authority-based arguments that miss the explanation in the end: http://www.google.nl/imgres?imgurl=http://www.solarchords.com/uploaded/82/87-33833-450015_44absorbspec.gif&imgrefurl=http://www.solarchords.com/agw-science/4/greenhouse--1-radiation/33784/&h=468&w=458&sz=28&tbnid=x2NtfKh5OPM7lM:&tbnh=98&tbnw=96&zoom=1&usg=__KldteWbV19nVPbbsC4jsOgzCK6E=&docid=cMRZ9f22jbtYPM&sa=X&ei=SwynUq2TMqiS0QXVq4C4Aw&ved=0CDkQ9QEwAw
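A back-of-the-envelope check of the latent-heat transport described in this thread, using textbook values that are assumptions of this note, not numbers from the thread or from the Nature paper:

    $L_v \approx 2.26 \times 10^{6}\ \mathrm{J\,kg^{-1}}$ (latent heat of vaporization of water)
    $c_p \approx 1.0 \times 10^{3}\ \mathrm{J\,kg^{-1}\,K^{-1}}$ (specific heat of air)

    Evaporating 1 kg of water at the surface absorbs $E \approx L_v \approx 2.26\ \mathrm{MJ}$,
    the energy needed to warm $m = E/(c_p\,\Delta T) \approx 2.26 \times 10^{6} / (1.0 \times 10^{3} \times 1) \approx 2.3 \times 10^{3}\ \mathrm{kg}$ of air by 1 K.

That energy is released only where the vapour condenses, typically at high altitude, which is the sense in which evaporation cools the surface.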
LeopoldS

Tox: A New Kind of Instant Messaging - 5 views

shared by LeopoldS on 02 Sep 14
  •  
    Skype alternative - open source, no central server, encryption built in ....
  • ...4 more comments...
  •  
    It's free and w/o ads. What's the business model? Their page doesn't say anything about it.
  •  
    To help society...
  •  
    They plan to secretly capture all communications and then sell them to NSA...
  •  
    probably developed by the NSA directly
  •  
    it's open source - go check it :-)
  •  
    my ID: 7C53B574D888EE0E2A97FCD62B144DD14730E45C1B7158D4ED3EBCCB920CB93A68C62E6C9385
LeopoldS

SparkleShare - Sharing work made easy - 3 views

  •  
    alternative to Dropbox that is fully open source and installable on your own servers - looks like a nice tool for us ...
  •  
    ... and it's based on Git! (the "proper geeky alternative to Subversion" thingie we are using to manage PaGMO's source code)
Francesco Biscani

Open Source Software Meets Do-It-Yourself Biology - 1 views

  •  
    "As the shift to open source software continues, computational biology will become even more accessible, and even more powerful, while intellectual property and other bureaucracies continue to hobble traditional forms of research." Spot on, I'm very glad to see this happening :)
Dario Izzo

JHelioviewer - 7 views

  •  
    Open source, developed entirely at ESA
LeopoldS

NASA - Open Source Summit 2011 - 1 views

  •  
    can't come at a better time with respect to SOCIS ....
Luís F. Simões

Shell energy scenarios to 2050 - 6 views

  •  
    just in case you were feeling happy and optimistic
  • ...7 more comments...
  •  
    An energy scenario published by an oil company? Allow me to be sceptical...
  •  
    Indeed, Shell has been an energy company, not just oil, for some time now ... The two scenarios are, in their approach, dependent on the economic and political situation, which is right now impossible to forecast. The reference to Kyoto is surprising, almost outdated! But overall, I find it rather optimistic in places, and the timeline (p37-39) is probably unlikely given recent events.
  •  
    the report was published in 2008, which explains the reference to Kyoto, as the follow-up to it was much more uncertain at that point. The Blueprint scenario is indeed optimistic, but also quite unlikely I'd say. I don't see humanity suddenly becoming so wise and coordinated. Sadly, I see something closer to the Scramble scenario as much more likely to occur.
  •  
    not an oil company??? please have a look at the percentage of their revenues coming from oil and gas and then compare this with all their other energy activities together and you will see very quickly that it is only window dressing ... they are an oil and gas company ... and nothing more
  •  
    not JUST oil. From a description: "Shell is a global group of energy and petrochemical companies." Of course the revenues coming from oil are the biggest; the investment turnover on other energy sources is small for now. Knowing that most of their revenue comes from an exhaustible source, to guarantee their future they invest elsewhere. They have invested >1b$ in renewable energy, including biofuels. They had the largest wind power business among the so-called "oil" companies. Oil only defines what they do "best". As a comparison, some time ago Apple was selling only computers and now they sell phones. But I would not say Apple is just a phone company.
  •  
    window dressing only ... e.g.:
    Net cash from operating activities (pre-tax) in 2008: $70 billion
    Net income in 2008: $26 billion
    Revenues in 2008: $88 billion
    Their investments and revenues in renewables don't even show up in their annual financial reports, since probably they are under the heading of "marketing", which is already $1.7 billion ... this is what they report on their investments:
    Capital investment, portfolio actions and business development: Capital investment in 2009 was $24 billion. This represents a 26% decrease from 2008, which included over $8 billion in acquisitions, primarily relating to Duvernay Oil Corp. Capital investment included exploration expenditure of $4.5 billion (2008: $11.0 billion).
    In Abu Dhabi, Shell signed an agreement with Abu Dhabi National Oil Company to extend the GASCO joint venture for a further 20 years.
    In Australia, Shell and its partners took the final investment decision (FID) for the Gorgon LNG project (Shell share 25%). Gorgon will supply global gas markets to at least 2050, with a capacity of 15 million tonnes (100% basis) of LNG per year and a major carbon capture and storage scheme. Shell has announced a front-end engineering and design study for a floating LNG (FLNG) project, with the potential to deploy these facilities at the Prelude offshore gas discovery in Australia (Shell share 100%). In Australia, Shell confirmed that it has accepted Woodside Petroleum Ltd.'s entitlement offer of new shares at a total cost of $0.8 billion, maintaining its 34.27% share in the company; $0.4 billion was paid in 2009 with the remainder paid in 2010.
    In Bolivia and Brazil, Shell sold its share in a gas pipeline and in a thermoelectric power plant and its related assets for a total of around $100 million.
    In Canada, the Government of Alberta and the national government jointly announced their intent to contribute $0.8 billion of funding towards the Quest carbon capture and sequestration project. Quest, which is at the f
  •  
    thanks for the info :) They still have their 50% share in the wind farm in the Noordzee (you can see it from ESTEC on a clear day). Look for Shell International Renewables, other subsidiaries and joint ventures. I guess the report is about the oil branch. http://sustainabilityreport.shell.com/2009/servicepages/downloads/files/all_shell_sr09.pdf http://www.noordzeewind.nl/
  •  
    no - it's about Shell globally - all of Shell ... these participations are just peanuts. Please read the CEO's intro in the pdf you linked to: he does not even mention renewables! Their entire sustainability strategy is about oil and gas - just making it (look) nicer and more environmentally friendly
  •  
    Fair enough; for me even peanuts are worth something, and I am not able to judge. Not all big-profit companies, like Shell, are evil :( Look in the pdf at what is in the upstream and downstream you mentioned above. Non-Shell sources for examples and more objectivity: http://www.nuon.com/company/Innovative-projects/noordzeewind.jsp http://www.e-energymarket.com/news/single-news/article/ferrari-tops-bahrain-gp-using-shell-biofuel.html thanks.
Francesco Biscani

What Open Source shares with Science - Khaotic Musings - conz's Blog at ZDNet.co.uk Com... - 0 views

  •  
    Beautiful and inspirational piece.
  •  
    Thinking about it, probably Open Source today is more faithful to the "scientific method" than most science, as far as the communication and sharing of information is concerned. We badly need to get rid of the dictatorship of journals and assorted bullshit like impact factors...
ESA ACT

GMAT - Home - 0 views

  •  
    Open source general mission analysis software. Should we get involved as developers?