Advanced Concepts Team / Group items tagged: operations

Operation Socialist: How GCHQ Spies Hacked Belgium's Largest Telco - 4 views

  •  
    interesting story with many juicy details on how they proceed ... (similarly interesting nickname for the "operation" chosen by our british friends) "The spies used the IP addresses they had associated with the engineers as search terms to sift through their surveillance troves, and were quickly able to find what they needed to confirm the employees' identities and target them individually with malware. The confirmation came in the form of Google, Yahoo, and LinkedIn "cookies," tiny unique files that are automatically placed on computers to identify and sometimes track people browsing the Internet, often for advertising purposes. GCHQ maintains a huge repository named MUTANT BROTH that stores billions of these intercepted cookies, which it uses to correlate with IP addresses to determine the identity of a person. GCHQ refers to cookies internally as "target detection identifiers." Top-secret GCHQ documents name three male Belgacom engineers who were identified as targets to attack. The Intercept has confirmed the identities of the men, and contacted each of them prior to the publication of this story; all three declined comment and requested that their identities not be disclosed. GCHQ monitored the browsing habits of the engineers, and geared up to enter the most important and sensitive phase of the secret operation. The agency planned to perform a so-called "Quantum Insert" attack, which involves redirecting people targeted for surveillance to a malicious website that infects their computers with malware at a lightning pace. In this case, the documents indicate that GCHQ set up a malicious page that looked like LinkedIn to trick the Belgacom engineers. (The NSA also uses Quantum Inserts to target people, as The Intercept has previously reported.) A GCHQ document reviewing operations conducted between January and March 2011 noted that the hack on Belgacom was successful, and stated that the agency had obtained access to the company's
  •  
    I knew I wasn't using TOR often enough...
  •  
    Cool! It seems that, after all, it is best to restrict employees' internet access to work-critical areas only... @Paul: Tor works on the network level, so it would not help much here, as the exploit relied on cookies (application level).
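The cookie-based identification described in the story amounts to a join between intercepted cookie sightings and a table of cookies already tied to known accounts. A minimal sketch of that correlation step, with entirely invented IPs, cookie values, and identities:

```python
# Hypothetical sketch of the cookie-to-IP correlation described above:
# intercepted cookies act as "target detection identifiers" linking an
# IP address to an identity. All data here is invented.

intercepted_cookies = [
    {"ip": "10.0.0.5", "cookie": "GAPS=abc123", "service": "google.com"},
    {"ip": "10.0.0.7", "cookie": "li_at=xyz789", "service": "linkedin.com"},
    {"ip": "10.0.0.5", "cookie": "li_at=qrs456", "service": "linkedin.com"},
]

# Cookies previously tied to known accounts (also invented)
known_accounts = {
    "li_at=xyz789": "engineer_a",
    "GAPS=abc123": "engineer_b",
}

def correlate(cookies, accounts):
    """Map each IP address to the set of identities its cookies reveal."""
    hits = {}
    for rec in cookies:
        identity = accounts.get(rec["cookie"])
        if identity:
            hits.setdefault(rec["ip"], set()).add(identity)
    return hits

print(correlate(intercepted_cookies, known_accounts))
```

Unmatched cookies are simply ignored; a single matched cookie per IP is enough to pin an identity, which is what made browser cookies so valuable here.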
Shell energy scenarios to 2050 - 6 views

  •  
    just in case you were feeling happy and optimistic
  • ...7 more comments...
  •  
    An energy scenario published by an oil company? Allow me to be sceptical...
  •  
    Indeed, Shell has been an energy company, not just an oil company, for some time now ... The two scenarios are, in their approach, dependent on the economic and political situation, which is right now impossible to forecast. The reference to Kyoto is surprising, almost outdated! But overall, I find it rather optimistic in places, and the timeline (p37-39) is probably unlikely given recent events.
  •  
    the report was published in 2008, which explains the reference to Kyoto, as the follow-up to it was much more uncertain at that point. The Blueprint scenario is indeed optimistic, but also quite unlikely I'd say. I don't see humanity suddenly becoming so wise and coordinated. Sadly, I see something closer to the Scramble scenario as much more likely to occur.
  •  
    not an oil company??? please have a look at the percentage of their revenues coming from oil and gas and then compare this with all their other energy activities together and you will see very quickly that it is only window dressing ... they are an oil and gas company ... and nothing more
  •  
    not JUST oil. From a description: "Shell is a global group of energy and petrochemical companies." Of course the revenues coming from oil are the biggest; the investment turnover on other energy sources is small for now. Knowing that most of their revenue comes from an exhaustible source, to guarantee their future they invest elsewhere. They have invested >$1b in renewable energy, including biofuels. They had the largest wind power business among the so-called "oil" companies. Oil only defines what they do "best". As a comparison: some time ago Apple sold only computers, and now they sell phones too. But I would not say Apple is just a phone company.
  •  
    window dressing only ... e.g.:
    - Net cash from operating activities (pre-tax) in 2008: $70 billion
    - Net income in 2008: $26 billion
    - Revenues in 2008: $88 billion
    Their investments and revenues in renewables don't even show up in their annual financial reports, probably because they fall under the heading of "marketing", which is already $1.7 billion ... This is what they report on their investments:
    "Capital investment, portfolio actions and business development. Capital investment in 2009 was $24 billion. This represents a 26% decrease from 2008, which included over $8 billion in acquisitions, primarily relating to Duvernay Oil Corp. Capital investment included exploration expenditure of $4.5 billion (2008: $11.0 billion). In Abu Dhabi, Shell signed an agreement with Abu Dhabi National Oil Company to extend the GASCO joint venture for a further 20 years. In Australia, Shell and its partners took the final investment decision (FID) for the Gorgon LNG project (Shell share 25%). Gorgon will supply global gas markets to at least 2050, with a capacity of 15 million tonnes (100% basis) of LNG per year and a major carbon capture and storage scheme. Shell has announced a front-end engineering and design study for a floating LNG (FLNG) project, with the potential to deploy these facilities at the Prelude offshore gas discovery in Australia (Shell share 100%). In Australia, Shell confirmed that it has accepted Woodside Petroleum Ltd.'s entitlement offer of new shares at a total cost of $0.8 billion, maintaining its 34.27% share in the company; $0.4 billion was paid in 2009 with the remainder paid in 2010. In Bolivia and Brazil, Shell sold its share in a gas pipeline and in a thermoelectric power plant and its related assets for a total of around $100 million. In Canada, the Government of Alberta and the national government jointly announced their intent to contribute $0.8 billion of funding towards the Quest carbon capture and sequestration project. Quest, which is at the f
  •  
    thanks for the info :) They still have their 50% share in the wind farm in Noordzee (you can see it from ESTEC on a clear day). Look for Shell International Renewables, other subsidiaries and joint-ventures. I guess, the report is about the oil branch. http://sustainabilityreport.shell.com/2009/servicepages/downloads/files/all_shell_sr09.pdf http://www.noordzeewind.nl/
  •  
    no - it's about Shell globally - all of Shell ... these participations are just peanuts. Please read the intro by the CEO in the pdf you linked to: he does not even mention renewables! Their entire sustainability strategy is about oil and gas - just making it (look) nicer and more environmentally friendly.
  •  
    Fair enough; for me even peanuts are worthwhile, and I am not able to judge. Not all big-profit companies, like Shell, are evil :( Look in the pdf at what is in the upstream and downstream you mentioned above. Non-Shell sources, for examples and more objectivity: http://www.nuon.com/company/Innovative-projects/noordzeewind.jsp http://www.e-energymarket.com/news/single-news/article/ferrari-tops-bahrain-gp-using-shell-biofuel.html thanks.
Johnson Electro Mechanical Systems - 0 views

  •  
    The JTEC is an all-solid-state engine that operates on the Ericsson cycle. Equivalent to Carnot, the Ericsson cycle offers the maximum theoretical efficiency available from an engine operating between two temperatures.
The Nanodevice Aiming to Replace the Field Effect Transistor - 2 views

  •  
    very nice! "For a start, the wires operate well as switches that by some measures compare well to field effect transistors. For example they allow a million times more current to flow when they are on compared with off when operating at a voltage of about 1.5 V. "[A light effect transistor] can replicate the basic switching function of the modern field effect transistor with competitive (and potentially improved) characteristics," say Marmon and co. But the wires also have entirely new capabilities. The device works as an optical amplifier and can also perform basic logic operations by using two or more laser beams rather than one. That's something a single field effect transistor cannot do."
  • ...1 more comment...
  •  
    The good thing about using CdSe NWs (used here) is that they show a photon-to-current efficiency window around the visible wavelengths; therefore any visible light can in principle be used in this application to switch the transistor on/off. I don't agree with the motto "Nanowires are also simpler than field effect transistors and so they're potentially cheaper and easier to make." Yes, they are simple, yet for applications, fabricating devices with them consistently is very challenging (and the research effort is not cheap at all...) and calls for improvements and breakthroughs in the fabrication process.
  •  
    any idea how they shine the light selectively onto such small surfaces?
  •  
    "Illumination sources consisted of halogen light, 532.016, 441.6, and 325 nm lasers ported through a Horiba LabRAM HR800 confocal Raman system with an internal 632.8 nm laser. Due to limited probe spacing for electrical measurements, all illumination sources were focused through a 50x long working distance (LWD) objective lens (N.A. = 0.50), except 325 nm, which went through a 10x MPLAN objective lens (N.A. = 0.25)." Laser spot size calculated from optical diffraction formula 1.22*lambda/NA
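The quoted spot-size estimate is easy to reproduce: the diffraction-limited spot diameter is d = 1.22·λ/NA. A small check for the wavelengths and objectives mentioned above (the pairing of wavelengths with NA values follows the quote):

```python
def spot_diameter_nm(wavelength_nm, numerical_aperture):
    """Diffraction-limited spot diameter d = 1.22 * lambda / NA."""
    return 1.22 * wavelength_nm / numerical_aperture

# Wavelength/objective pairs quoted in the comment above
for wl, na in [(532.016, 0.50), (441.6, 0.50), (325.0, 0.25)]:
    d = spot_diameter_nm(wl, na)
    print(f"{wl} nm, NA={na}: spot diameter ~ {d / 1000:.2f} um")
```

So even the UV line through the lower-NA objective focuses to a micron-scale spot, i.e., comparable to the length of a single nanowire.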
First circuit breaker for high voltage direct current - 2 views

  •  
    Doesn't really sound sexy, but this is of utmost importance for next generation grids for renewable energy.
  •  
    I agree on the significance indeed - a small boost also for my favourite Desertec project ... though their language is a bit too "grandiose": "ABB has successfully designed and developed a hybrid DC breaker after years of research, functional testing and simulation in the R&D laboratories. This breaker is a breakthrough that solves a technical challenge that has been unresolved for over a hundred years and was perhaps one of the main influencers in the 'war of currents' outcome. The 'hybrid' breaker combines mechanical and power-electronics switching that enables it to interrupt power flows equivalent to the output of a nuclear power station within 5 milliseconds - that's as fast as a single flap of a honey bee's wing - and more than 30 times faster than the reaction time of an Olympic 100-meter medalist to the starter's gun! But it's not just about speed. The challenge was to do it 'ultra-fast' with minimal operational losses, and this has been achieved by combining advanced ultra-fast mechanical actuators with our in-house semiconductor IGBT valve technologies, or power electronics (watch video: Hybrid HVDC Breaker - How does it work). In terms of significance, this breaker is a 'game changer'. It removes a significant stumbling block in the development of HVDC transmission grids, where planning can start now. These grids will enable interconnection and load balancing between HVDC power superhighways, integrating renewables and transporting bulk power across long distances with minimal losses. DC grids will enable sharing of resources like lines and converter stations, providing reliability and redundancy in a power network in an economically viable manner with minimal losses. ABB's new hybrid HVDC breaker, in simple terms, will enable the transmission system to maintain power flow even if there is a fault on one of the lines. This is a major achievement for the global R&D team in ABB who have worked for years on the challeng
Government Lab Reveals It Has Operated Quantum Internet For Over Two Years | MIT Techno... - 0 views

  •  
    They are already surfing the quantum web...
Everything You Wanted to Know about Space Tourism but Were Afraid to Ask | Space Safety... - 3 views

  •  
    "chances are that if 700 passengers are flown annually, up to 10 of them might not survive the flight in the first years of the operations." Most remarkable is also the question of who is to blame if a dead and burned space tourist's corpse comes crashing down from the sky onto your car.
  • ...3 more comments...
  •  
    How sure is the information that a human body would not completely burn / ablate during atmospheric re-entry? I am not aware of any material ground tests in a plasma wind tunnel confirming that human tissue would survive re-entry from LEO.
  •  
    Since a steak would not even be cooked by dropping it from very high altitudes (http://what-if.xkcd.com/28/), I doubt that a space tourist's body would disintegrate during atmospheric re-entry.
  •  
    Funny link; however, some things are not clear enough:
    1. The ablation rate is unknown.
    2. What are the entry conditions? The link suggests that the steak is just dropped (no initial velocity).
    3. What about the ballistic coefficient?
    4. What would the orientation of the entry body be? It would be a quite unsteady configuration, I guess, with heavy accelerations.
    5. How would vacuum exposure affect the water in the body/steak, and what would be the consequence for the ablation behaviour?
    6. Does surface chemistry play a role (not ablation, but catalysis)?
    My conclusion: the example with the steak is a funny and not-so-bad exercise, but not more.
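For a rough feel of the entry-conditions question, one can estimate the stagnation-point heating with the classic Sutton-Graves correlation, q = k·sqrt(ρ/R_n)·V³. The sketch below uses invented but plausible values (density near 60 km altitude, roughly orbital entry speed, a ~0.3 m effective nose radius); it is a back-of-the-envelope estimate, not a substitute for the plasma wind tunnel tests or ablation modelling discussed above:

```python
import math

def sutton_graves_heat_flux(rho, velocity, nose_radius, k=1.7415e-4):
    """Stagnation-point convective heat flux [W/m^2] from the Sutton-Graves
    correlation q = k * sqrt(rho / R_n) * V^3 (SI units, Earth atmosphere)."""
    return k * math.sqrt(rho / nose_radius) * velocity**3

rho = 3e-4          # kg/m^3, rough atmospheric density near 60 km (assumed)
velocity = 7800.0   # m/s, approximate entry speed from LEO (assumed)
nose_radius = 0.3   # m, effective bluntness of the body (assumed)

q = sutton_graves_heat_flux(rho, velocity, nose_radius)
print(f"stagnation-point heat flux ~ {q / 1e4:.0f} W/cm^2")
```

Heat fluxes on the order of hundreds of W/cm² are exactly why the unknown ablation rate is the crucial point.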
  •  
    This calls for some serious simulations by the Petkow code, it seems to me ...
  •  
    I still would need some serious input data...
Four-wheel nanocar takes to the road - physicsworld.com - 1 views

  •  
    A "four-wheel drive car" less than one billionth the length of an average SUV has been built and operated by researchers in the Netherlands and Switzerland.
  •  
    "Molecular machines are common in nature. Motor proteins, for example, can move along a surface to transport molecular-sized cargo and are often used to build structures within living cells. " reminds me of the fantastic movie on what happens inside a cell ...
Cell phones are 'Stalin's dream,' says free software movement founder - 3 views

  •  
    "I don't have a cell phone. I won't carry a cell phone," says Stallman, founder of the free software movement and creator of the GNU operating system. "It's Stalin's dream. Cell phones are tools of Big Brother. I'm not going to carry a tracking device that records where I go all the time, and I'm not going to carry a surveillance device that can be turned on to eavesdrop." he is right once more ...
  •  
    I am going to live in the forest! Sadly, while true, there's no way around it these days. On the up-side the information overflow these days exceeds processing speeds. Soon it will become increasingly difficult for NSA or other organizations to find anything in the tons of data they stash away. Like some guy said in a random youtube video I can't find now anymore: "good luck trying to find my personal data when I'm tagged in 5000 pictures of cats!"
Why starting from differential equations for computational physics? - 1 views

  •  
    "The computational methods currently used in physics are based on the discretization of differential equations. This is because the computer can only perform algebraic operations. The purpose of this paper is to critically review this practice, showing how to obtain a purely algebraic formulation of physical laws starting directly from experimental measurements."
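The abstract's point that "the computer can only perform algebraic operations" is easy to illustrate: any discretization turns a differential equation into pure algebra. A minimal sketch (not from the paper) for dy/dt = -y with forward Euler, where every time step is just a multiplication:

```python
import math

def euler(h, t_end):
    """Forward-Euler solution of dy/dt = -y, y(0) = 1: the differential
    equation is reduced to the algebraic update y[k+1] = y[k] * (1 - h)."""
    n = round(t_end / h)       # number of algebraic update steps
    y = 1.0
    for _ in range(n):
        y *= (1.0 - h)
    return y

for h in (0.1, 0.01, 0.001):
    print(f"h={h}: y(1) ~ {euler(h, 1.0):.6f}  (exact: {math.exp(-1):.6f})")
```

The paper's argument goes further: instead of discretizing a differential formulation after the fact, one can start directly from the finite, algebraic relations that the measurements themselves provide.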
Massively collaborative mathematics : Article : Nature - 28 views

  •  
    peer-to-peer theorem-proving
  • ...14 more comments...
  •  
    Or: mathematicians catch up with open-source software developers :)
  •  
    "Similar open-source techniques could be applied in fields such as [...] computer science, where the raw materials are informational and can be freely shared online." ... or we could reach the point, unthinkable only few years ago, of being able to exchange text messages in almost real time! OMG, think of the possibilities! Seriously, does the author even browse the internet?
  •  
    I do not agree with you, F.; you are citing out of context! Sharing messages does not make a collaboration, nor does a forum ... You need a set of rules and a common objective. This is clearly observable in "some team", where these rules are lacking, making teamwork nonexistent. The additional difficulties here are that it involves people who are almost strangers to each other, and the immateriality of the project. The support they are using (web, wiki) is only secondary. What they achieved is remarkable, regardless of the subject!
  •  
    I think we will just have to agree to disagree then :) Open source developers have been organizing themselves with emails since the early '90s, and most projects (e.g., the Linux kernel) still do not use anything else today. The Linux kernel mailing list gets around 400 messages per day, and they are managing just fine to scale as the number of contributors increases. I agree that what they achieved is remarkable, but it is more for "what" they achieved than "how". What they did does not remotely qualify as "massively" collaborative: again, many open source projects are managed collaboratively by thousands of people, and many of them are in the multi-million lines of code range. My personal opinion of why in the scientific world these open models are having so many difficulties is that the scientific community today is (globally, of course there are many exceptions) a closed, mostly conservative circle of people who are scared of changes. There is also the fact that the barrier of entry in a scientific community is very high, but I think that this should merely scale down the number of people involved and not change the community "qualitatively". I do not think that many research activities are so much more difficult than, e.g., writing an O(1) scheduler for an Operating System or writing a new balancing tree algorithm for efficiently storing files on a filesystem. Then there is the whole issue of scientific publishing, which, in its current form, is nothing more than a racket. No wonder traditional journals are scared to death by these open-science movements.
  •  
    here we go ... a nice controversy! But maybe too many things are mixed up together - open-science journals vs traditional journals, the conservatism of the science community wrt programmers (to me, one reason for this might be the average age of the two groups, which probably differs by more than 10 years ...), and then emailing wrt other collaboration tools ... I will have to look at the paper more carefully now ... (I am surprised to see no comment from José or Marek here :-))
  •  
    My point about your initial comment is that it is simplistic to infer that emails imply collaborative work. You actually use the word "organize" - what does it mean, indeed? In the case of Linux, what makes the project work are the rules they set and the management style (hierarchy, meritocracy, review). Mailing is just a means of coordination. In collaborations and team work, it is about rules, not only about the technology you use to potentially collaborate. Otherwise, all projects would be successful, and we would not learn management at school! They did not write that they managed the collaboration exclusively because of wikipedia and emails (or other 2.0 technology)! You are missing the part that makes it successful and remarkable as a project. On his blog, the guy put up a list of 12 rules for this project. None are related to emails, wikipedia or forums ... because that would be lame, and then your comment would make sense. Following your argumentation, the tools would be sufficient for collaboration. In the ACT, we have plenty of tools, but no team work. QED
  •  
    the question of team work in the ACT is one that comes back continuously, and so far it has always boiled down to this: how much does there need to be a single team project to which everybody in the team contributes in his/her own way, versus how much should we let smaller, flexible teams form and progress within the team, following a bottom-up initiative rather than one imposed top-down? At this very moment, there are at least 4 to 5 such teams, with their own tools and mechanisms, active and operating within the team. But hey, if there is a real will for one larger project to which all or most members want to contribute, let's go for it ... though in my view, it should be on a convince rather than oblige basis ...
  •  
    It is, though, indicative that some of the team members do not see all the collaboration and team work happening around them. We always leave the small and agile sub-teams to form and organize themselves spontaneously, but clearly this method leaves out some people (be it through their own personal attitude or through pure chance). For those cases we could think of providing the possibility to participate in an alternative, more structured form of team work, where we actually manage the hierarchy and meritocracy and perform the project review (to use Joris' words).
  •  
    I am, and was, involved in "collaborations", but I can say from experience that we are mostly a sum of individuals. In the end, it is always one or two individuals doing the job, and the others waiting. Sometimes, even, some people don't do what they are supposed to do, so nothing happens ... this could not be defined as team work. Don't get me wrong, this is the dynamic of the team and I am OK with it ... in the end it is less work for me :) (By team I mean 3 members or more; I am personally not looking for a 15-member team effort, and that is not what I meant.) Anyway, this is not exactly the subject of the paper.
  •  
    My opinion about this is that a research team, like the ACT, is a group of _people_ and not only brains. What I mean is that people have feelings, hate, anger, envy, sympathy, love, etc about the others. Unfortunately(?), this could lead to situations, where, in theory, a group of brains could work together, but not the same group of people. As far as I am concerned, this happened many times during my ACT period. And this is happening now with me in Delft, where I have the chance to be in an even more international group than the ACT. I do efficient collaborations with those people who are "close" to me not only in scientific interest, but also in some private sense. And I have people around me who have interesting topics and they might need my help and knowledge, but somehow, it just does not work. Simply lack of sympathy. You know what I mean, don't you? About the article: there is nothing new, indeed. However, why it worked: only brains and not the people worked together on a very specific problem. Plus maybe they were motivated by the idea of e-collaboration. No revolution.
  •  
    Joris, maybe I did not make myself clear enough, but my point was only tangentially related to the tools. Indeed, it was the original article's mention of the "development of new online tools" which prompted my reply about emails. Let me try to say it more clearly: my point is that what they accomplished is nothing new methodologically (i.e., online collaboration of a loosely knit group of people); it is something that has been done countless times before. Do you think that the fact that it is now mathematicians doing it makes it somehow special or different? Personally, I don't. You should come over to some mailing lists of mathematical open-source software (e.g., SAGE, Pari, ...); there's plenty of online collaborative research going on there :) I also disagree that, as you say, "in the case of Linux, what makes the project work is the rules they set and the management style (hierarchy, meritocracy, review)". First of all, I think the main engine of any collaboration like this is the objective, i.e., wanting to get something done. Rules emerge from self-organization later on, and they may be completely different from project to project, ranging from almost anarchy to BDFL (benevolent dictator for life) style. Given the kind of variety that can be observed in open-source projects today, I am very skeptical that any kind of management rule can be said to be universal (and I am pretty sure that the overwhelming majority of project organizers never went to any "management school"). Then there is the social aspect that Tamas mentions above. From my personal experience, communities that put technical merit above everything else tend to remain very small and generally become irrelevant. The ability to work and collaborate with others is the main asset that a participant in a community can bring. I've seen many times on the Linux kernel mailing list contributions deemed "technically superior" being disregarded and not considered for inclusion in the kernel because it was clear that
  •  
    hey, just caught up on the discussion. For me, what is very new is mainly the framework in which this collaborative (open) work is applied. I haven't seen this kind of open working in any other field of academic research (except for BOINC-type projects, which are very different, because they rely on non-specialists for the work to be done). This raises several problems, mainly that of credit, which has not really been solved as I read in the wiki (if an article is written, who writes it, and whose names go on the paper?). They chose to refer to the project, and not to the individual researchers, as a temporary solution... It is not so surprising to me that this type of work was first done in the domain of mathematics. Perhaps I have an idealized view of this community, but it seems that the result obtained is more important than who obtained it... In many areas of research this is not the case, and one reason is how the research is financed. To obtain money you need to have (scientific) credit, and to have credit you need to have papers with your name on them... so this model of research does not fit, in my opinion, with the way research is governed. Anyway, we had a discussion on the Ariadnet about how to use it, and one idea was to do this kind of collaborative research; an idea that was quickly abandoned...
  •  
    I don't really see much of a problem with giving credit. It is not the first time a group of researchers has collectively taken credit for a result under a group umbrella; e.g., see Nicolas Bourbaki: http://en.wikipedia.org/wiki/Bourbaki Again, if the research process is completely transparent and publicly accessible, there's no way to fake contributions or to give undue credit, and one could cite a group paper without problems in his/her CV, research grant application, etc.
  •  
    Well, my point was more that it could be a problem with how the actual system works. Let's say you want a grant or a position; then the jury will count the number of papers with you as first author, and the other papers (at least in France)... and look at the impact factors of these journals. Then you would have to set up a rule for classifying the authors (endless and pointless discussions), and give an impact factor to the group...?
  •  
    it seems that I should visit you guys at ESTEC... :-)
  •  
    urgently!! btw: we will have the ACT Christmas dinner on the 9th in the evening ... are you coming?
NASA Goddard to Auction off Patents for Automated Software Code Generation - 0 views

  • The technology was originally developed to generate control code for spacecraft swarms, but it is broadly applicable to any commercial application where rule-based systems development is used.
  •  
    This is related to the "Verified Software" item in NewScientist's list of ideas that will change science. At the link below you'll find the text of the patents being auctioned: http://icapoceantomo.com/item-for-sale/exclusive-license-related-improved-methodology-formally-developing-control-systems :) Patent #7,627,538 ("Swarm autonomic agents with self-destruct capability") makes for quite an interesting read: "This invention relates generally to artificial intelligence and, more particularly, to architecture for collective interactions between autonomous entities." "In some embodiments, an evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy." "In yet another aspect, an autonomous nanotechnology swarm may comprise a plurality of workers composed of self-similar autonomic components that are arranged to perform individual tasks in furtherance of a desired objective." "In still yet another aspect, a process to construct an environment to satisfy increasingly demanding external requirements may include instantiating an embryonic evolvable neural interface and evolving the embryonic evolvable neural interface towards complex complete connectivity." "In some embodiments, NBF 500 also includes genetic algorithms (GA) 504 at each interface between autonomic components. The GAs 504 may modify the intra-ENI 202 to satisfy requirements of the SALs 502 during learning, task execution or impairment of other subsystems."
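The patent's mention of "genetic algorithms (GA) 504 at each interface" refers to the generic GA recipe: selection, crossover, mutation. A minimal, self-contained illustration of that recipe on a toy OneMax fitness (nothing here comes from the actual patent):

```python
import random

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION = 20, 30, 60, 0.02

def fitness(genome):
    """Toy OneMax fitness: count of 1-bits in the genome."""
    return sum(genome)

def evolve(seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]              # truncation selection (elitist)
        children = []
        while len(children) < POP_SIZE - len(parents):
            a, b = rng.sample(parents, 2)          # pick two distinct parents
            cut = rng.randrange(1, GENOME_LEN)     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < MUTATION) for g in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best fitness:", fitness(best), "of", GENOME_LEN)
```

In the patent's architecture the same loop would sit at each interface between autonomic components, with the fitness function encoding that interface's requirements.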
The Olympics run on Windows (XP) | Beyond Binary - CNET News - 3 views

  •  
    The good news for Microsoft is that all the PCs powering the Olympics are running Windows. The bad news: it's the older Windows XP operating system.
  •  
    Now I start to understand why the Swiss win so many medals. That's most probably a bug!!!
STLport: An Interview with A. Stepanov - 2 views

  • Generic programming is a programming method that is based in finding the most abstract representations of efficient algorithms.
  • I spent several months programming in Java.
  • for the first time in my life programming in a new language did not bring me new insights
  • ...2 more annotations...
  • it has no intellectual value whatsoever
  • Java is clearly an example of a money oriented programming (MOP).
  •  
    One of the authors of the STL (C++'s Standard Template Library) explains generic programming and slams Java.
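Stepanov's definition - "finding the most abstract representations of efficient algorithms" - can be illustrated even in Python: write the algorithm once, in terms of the minimal operations it requires (iteration and operator <), and every conforming type gets it for free. A loose sketch of the STL's min_element:

```python
# Generic-programming sketch: the algorithm depends only on the minimal
# requirements "iterable" and "supports <", not on any concrete type.

def min_element(iterable):
    """STL-style min_element: requires only iteration and operator <."""
    it = iter(iterable)
    best = next(it)
    for x in it:
        if x < best:
            best = x
    return best

print(min_element([3, 1, 2]))             # ints
print(min_element(["pear", "apple"]))     # strings
print(min_element([(2, "b"), (1, "a")]))  # tuples, lexicographic <
```

In C++ the same idea is expressed with templates, which let the compiler generate an efficient specialization per type instead of dispatching at runtime.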
  • ...6 more comments...
  •  
    "Java is clearly an example of a money oriented programming (MOP)." Exactly. And for the industry it's the money that matters. Whatever mathematicians think about it.
  •  
    It is actually a good thing that it is "MOP" (even though I do not agree with this term): that is what makes it interoperable, light and easy to learn. There is no point in writing fancy code if it does not bring anything to the end-user, but only gives geeks incomprehensible things to discuss in forums. Anyway, I am pretty sure we can find a Java guy slamming C++ ;)
  •  
    Personally, I never understood what the point of Java is, given that: 1) I do not know of any developer (maybe Marek?) that uses it for intellectual pleasure/curiosity/fun, whatever, given the possibility of choice - this to me speaks more loudly about the objective qualities of the language than any industrial-corporate marketing bullshit (for the record, I argue that Python is more interoperable, lighter and easier to learn than Java - which is why, e.g., Google is using it heavily); 2) I have used software developed in Java maybe a total of 5 times on any computer/laptop I have owned over 15 years. I cannot name one single Java project that I find necessary or even useful; for my usage of computers, Java could disappear overnight without my even noticing. Then of course one can argue as much as one wants about the "industry choosing Java", to which I would counterargue with examples of industry doing stupid things and making absurd choices. But I suppose it would be a kind of pointless discussion, so I'll just stop here :)
  •  
    "At Google, Python is one of the 3 'official languages', alongside C++ and Java." Java runs everywhere (the byte code itself); that is, I think, the only reason it became famous. Python, I guess, would be heavier if it were to run in your web browser! I think every language has its pros and cons, but I agree Java is not the answer to everything... Java is used in MATLAB, some web applications, mobile phone apps, ... I would be in a bit of trouble if it were to disappear today :(
  •  
    I personally do not believe in interoperability :)
  •  
    Well, I bet you'd notice an overnight disappearance of Java, because half of the internet would vanish... J2EE technologies are simply omnipresent there... I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank's online services are written in Java. Certainly not in PHP+MySQL :) Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you to develop robust, reliable, modular, well-integrated etc. software. And the costs? Well, using Java technologies you can set up enterprise-quality web application servers and get a fully featured development environment (better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... For many years now, the central issue in software development has not been implementing algorithms but building applications. And that's where Java outperforms many other technologies. A final remark, because I may mistakenly be taken for an apostle of Java or something... I love the idea of generic programming, C++ is my favourite programming language (and I used to read Stroustrup before sleep), and in my leisure time I write programs in Python... But if I were to start a software development company, then, apart from some very niche applications like computer games, it would most probably use Java as its main technology.
  •  
    "I'd rather not even *think* about developing a web application/webservice/web-whatever in standard C++... is it actually possible?? Perhaps with some weird Microsoft solutions... I bet your bank's online services are written in Java. Certainly not in PHP+MySQL :)" Doing it in C++ would be awesomely crazy, I agree :) But as I see it there are lots of huge websites that run on PHP, see for instance Facebook. For the banks and the enterprise market, as a general rule I tend to take with a grain of salt whatever spin comes out of them; in the end, behind every corporate IT decision there is a little smurf just trying to survive and have his back covered :) As they used to say in the old times, "No one ever got fired for buying IBM". "Industry has chosen Java not because of industrial-corporate marketing bullshit, but because of economics... it enables you to develop robust, reliable, modular, well-integrated etc. software. And the costs? Well, using Java technologies you can set up enterprise-quality web application servers and get a fully featured development environment (better than ANY C/C++/whatever development environment I've EVER seen) at the cost of exactly 0 (zero!) USD/GBP/EUR... For many years now, the central issue in software development has not been implementing algorithms but building applications. And that's where Java outperforms many other technologies." Apart from the IDE considerations (on which I cannot comment, since I'm not an IDE user myself), I do not see how Java beats the competition in this regard (again: Python and the huge software ecosystem surrounding it). My impression is that Java's success is mostly due to Sun pushing it like there is no tomorrow and bundling it with their hardware business.
  •  
    OK, I think there is a bit of everything, wrong and right, but you have to acknowledge that Python is not always the simplest. For info, Facebook uses Java (if you upload a picture, for instance), and PHP is very limited. So definitely, in companies, engineers like you and me select the language; it is not a marketing or political thing. And in the case of fb, they came to the conclusion that PHP and Java each don't do everything but complement each other. As you say, Python has many things around it, but it might be too much for simple applications. Otherwise, I would seriously be interested in a study of how to implement a Python-like system on board spacecraft and what the advantages are over mixing C, Ada and Java.

Basics of Space Flight - 0 views

  •  
    Basics of Space Flight is a tutorial designed primarily to help operations people identify the range of concepts associated with deep space missions, and grasp the relationships among them.

Why Sleep? | Physical Review Focus - 0 views

shared by ESA ACT on 24 Apr 09 - Cached
  •  
    A study in the January Physical Review E suggests that a sleep-wake cycle, allowing the brain to focus on one task at a time, may be the most efficient way to operate. The researcher shows mathematically that processing a continuously changing resource--s

SUSE Studio - 0 views

shared by ESA ACT on 24 Apr 09 - Cached
  •  
    Create a tuned server appliance, containing your application and just enough operating system components

gOS - Discover a good OS. - 0 views

  •  
    An operating system where, instead of local applications, one finds links to internet applications. Light and easy to use.

Google's Go: A New Programming Language That's Python Meets C++ - 6 views

  •  
    Big news for developers out there: Google has just announced the release of a new, open sourced programming language called Go. The company ...
  •  
    Ugh... no operator overloading, no efficient generic programming and no lambda expressions... Only time will tell, but I don't understand who the intended audience is: I think that the Python guys won't care about the (supposedly) increased performance (and you can interface C/C++ with Python easily), and that C++ programmers (I mean the hardcore serious C++ Boost-like programmers, not the Java-like whiners :P) won't have their beloved templates pried from their cold dead hands with ease.
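    For context, operator overloading is what lets user-defined types participate in arithmetic syntax. A toy Python sketch (a hypothetical Vec2 type, purely illustrative) of what the comment says Go leaves out:

    ```python
    # Operator overloading: the class itself defines what "+" and "*" mean,
    # via the __add__ and __mul__ special methods.
    class Vec2:
        def __init__(self, x, y):
            self.x, self.y = x, y

        def __add__(self, other):   # v + w -> component-wise sum
            return Vec2(self.x + other.x, self.y + other.y)

        def __mul__(self, k):       # v * k -> scalar multiple
            return Vec2(self.x * k, self.y * k)

        def __repr__(self):
            return f"Vec2({self.x}, {self.y})"

    v = Vec2(1, 2) + Vec2(3, 4) * 2
    print(v)  # Vec2(7, 10)
    ```

    In Go (as released) the same computation has to be spelled out with named methods like `v.Add(w.Scale(2))`, which is exactly the ergonomic loss being complained about.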
  •  
    yeah, though I think operator overloading in particular is not going to be a major problem; as with the JS library, it is quite conceivable that lots of users will switch to it (or be made to use it...) simply because it is done by Google
  •  
    Having Google backing it will certainly help, even though they are presenting it as a "system level" (i.e., hard-core) language, and in that domain it is much more difficult to bullshit your way to a position of relevance. Look at Java: Sun pushed it like hell and it is certainly widely used in many contexts (corporate, web and embedded markets mostly), yet it completely failed to win the hearts of "open-source" developers (or, more generally, of those developers who are not forced to use it by virtue of some management-driven decision).
  •  
    "or, more generally, of those developers who are not forced to use it by virtue of some management-driven decision" completely agree with that!!

physicists explain what AI researchers are actually doing - 5 views

  •  
    love this one ... it seems to take physicists to explain to the AI crowd what they are actually doing ... Deep learning is a broad set of techniques that uses multiple layers of representation to automatically learn relevant features directly from structured data. Recently, such techniques have yielded record-breaking results on a diverse set of difficult machine learning tasks in computer vision, speech recognition, and natural language processing. Despite the enormous success of deep learning, relatively little is understood theoretically about why these techniques are so successful at feature learning and compression. Here, we show that deep learning is intimately related to one of the most important and successful techniques in theoretical physics, the renormalization group (RG). RG is an iterative coarse-graining scheme that allows for the extraction of relevant features (i.e. operators) as a physical system is examined at different length scales. We construct an exact mapping from the variational renormalization group, first introduced by Kadanoff, to deep learning architectures based on Restricted Boltzmann Machines (RBMs). We illustrate these ideas using the nearest-neighbor Ising Model in one and two dimensions. Our results suggest that deep learning algorithms may be employing a generalized RG-like scheme to learn relevant features from data.
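    For the curious, here is a minimal sketch of the kind of RBM the abstract refers to, trained with one-step contrastive divergence (CD-1). All sizes, the seed and the learning rate are illustrative choices of mine, not taken from the paper.

    ```python
    # A tiny binary Restricted Boltzmann Machine trained with CD-1
    # on a single fixed pattern; reconstruction error should shrink.
    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden, lr = 6, 3, 0.1
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # weights
    a = np.zeros(n_visible)   # visible biases
    b = np.zeros(n_hidden)    # hidden biases

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(v0):
        """One contrastive-divergence update from a binary visible vector."""
        global W, a, b
        p_h0 = sigmoid(v0 @ W + b)                     # hidden probs given data
        h0 = (rng.random(n_hidden) < p_h0).astype(float)  # sample hidden units
        p_v1 = sigmoid(h0 @ W.T + a)                   # reconstruction
        p_h1 = sigmoid(p_v1 @ W + b)                   # hidden probs given recon
        # Positive phase minus negative phase:
        W += lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
        a += lr * (v0 - p_v1)
        b += lr * (p_h0 - p_h1)
        return float(np.mean((v0 - p_v1) ** 2))        # reconstruction error

    v = rng.integers(0, 2, n_visible).astype(float)    # one training pattern
    errors = [cd1_step(v) for _ in range(200)]
    ```

    The hidden layer plays the role of the coarse-grained variables in the RG analogy: it keeps a compressed summary of the visible configuration, which is exactly the mapping the paper makes precise.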