Advanced Concepts Team / Group items tagged: additive

santecarloni

Light bends itself round corners - physicsworld.com - 1 views

  •  
    The Florida team generated a specially shaped laser beam that could self-accelerate, or bend, sideways.
  •  
    very nice!!! Read this, e.g.: "In addition to this self-bending, the beam's intensity pattern also has a couple of other intriguing characteristics. One is that it is non-diffracting, which means that the width of each intensity region does not appreciably increase as the beam travels forwards. This is unlike a normal beam - even a tightly collimated laser beam - which spreads as it propagates. The other unusual property is that of self-healing. This means that if part of the beam is blocked by opaque objects, then any disruptions to the beam's intensity pattern could gradually recover as the beam travels forward."
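
For reference, the effect rests on the non-spreading "accelerating" wave packet of Berry and Balazs, realized optically by Siviloglou and Christodoulides. In normalized paraxial units (s transverse, xi along the beam) the ideal Airy beam solves

    i \frac{\partial \psi}{\partial \xi} + \frac{1}{2} \frac{\partial^2 \psi}{\partial s^2} = 0,
    \qquad
    \psi(s,\xi) = \mathrm{Ai}\!\left(s - \frac{\xi^2}{4}\right)
    \exp\!\left[ i \left( \frac{s\xi}{2} - \frac{\xi^3}{12} \right) \right],

so the intensity |\psi|^2 depends only on s - \xi^2/4: the profile keeps its shape (non-diffracting) while its peak drifts along the parabola s = \xi^2/4 (the self-bending). The ideal beam carries infinite energy; real beams are apodized, which is why the bending and self-healing hold only over a finite range.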
Joris _

NASA International Space Station Longeron Marathon Challenge - 1 views

shared by Joris _ on 18 Jan 13
LeopoldS liked it
  •  
    nice - did not know about it. GTOC on steroids and with loads of cash. Concerning this specific challenge, and especially the last condition: doesn't this hint towards a flawed design? In addition to maximizing the total power output there are some constraints on the possible movements (a toy feasibility check is sketched after this thread):
    - Each SARJ and BGA is limited to a maximum angular velocity and to a maximum angular acceleration.
    - Each SAW must produce at least some minimum average power over the orbit (which is different for each SAW).
    - The sequence of positions must be cyclic, so it can be repeated on the next orbit.
    - The maximum amount of BGA rotation is not limited, but exceeding a threshold will result in a score penalty.
    - Some structural members of the SAW mast (called longerons) have restrictions on how they can be shadowed.
  •  
    The longerons will expand and contract with exposure to the Sun (I think, whatever material they are made of). Because you have 4 longerons in a mast, you just need to be careful that the mast is well balanced and that the 4 longerons support each other; basically, you need an even number of shadowed longerons, possibly zero. I would call this an operational constraint.
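
As referenced above, a toy feasibility check for the kinematic part of the constraints; this is a minimal sketch with made-up limits and a made-up uniform rotation profile (the real challenge adds per-SAW power floors, the BGA rotation penalty and the longeron shadowing rules):

    import numpy as np

    MAX_RATE = 0.25    # hypothetical max angular velocity, deg per step
    MAX_ACCEL = 0.005  # hypothetical max angular acceleration, deg per step^2

    def feasible(theta):
        """theta: joint angle (deg) at each time step over one orbit."""
        rate = np.diff(theta)    # finite-difference angular velocity
        accel = np.diff(rate)    # finite-difference angular acceleration
        cyclic = np.isclose(theta[0] % 360.0, theta[-1] % 360.0)  # repeatable next orbit
        return bool(np.all(np.abs(rate) <= MAX_RATE) and
                    np.all(np.abs(accel) <= MAX_ACCEL) and cyclic)

    # One uniform revolution over a ~92-minute orbit, sampled once per second.
    print(feasible(np.linspace(0.0, 360.0, 92 * 60)))  # True under the toy limits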
jcunha

3D printed sonic tractor beam - 2 views

  •  
    A good DIY project - build a tractor beam using 3D printed parts. The 3D files and research are available for download. A new way for microgravity research @fichbio ?
LeopoldS

Culturomics Looks at the Birth and Death of Words - WSJ.com - 0 views

  •  
    very nice work indeed. Here's Slashdot's summary, with additional links: Physicists Discover Evolutionary Laws of Language
  •  
    this is the study I was talking about over lunch ...
Lionel Jacques

Higgs hunters close in on their quarry - 1 views

  •  
    The first solid experimental evidence for the existence of the Higgs boson has been unveiled today by physicists working on the Large Hadron Collider (LHC) at CERN in Geneva. Members of the ATLAS experiment revealed evidence that the Higgs particle has a mass of about 126 GeV/c². "By 2014/2015 we could have enough additional data to eliminate large classes of theories that attempt to explain the Higgs,"
Daniel Hennes

A.I. XPRIZE - 3 views

  •  
    TED is sponsoring an A.I. XPRIZE. The goal? Develop an artificial intelligence that jumps on stage and gives a 3-minute talk on a random topic...
  •  
    I am going to propose that the rules include, in addition, something practical - like washing the dishes... If we are to foster progress, let's finally do so in the right direction...
  •  
    This sort of reminds me of Hinton's paper from some years ago: http://www.cs.utoronto.ca/~ilya/pubs/2011/LANG-RNN.pdf Train it on previous TED talks and let it produce TED-talk-like gibberish. It would probably be of similar value. He had a nice one on the meaning of life but I can't find it anymore.
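
In that spirit, a character-trigram Markov chain - a far cruder device than the RNN in the linked paper - is already enough to emit TED-talk-like gibberish; the transcript file name here is hypothetical:

    import random
    from collections import defaultdict

    corpus = open("ted_transcripts.txt").read()  # hypothetical dump of past talks

    # Map every 3-character context to the characters observed to follow it.
    model = defaultdict(list)
    for i in range(len(corpus) - 3):
        model[corpus[i:i + 3]].append(corpus[i + 3])

    state, out = corpus[:3], [corpus[:3]]
    for _ in range(500):
        nxt = random.choice(model.get(state, " "))  # unseen context: emit a space
        out.append(nxt)
        state = state[1:] + nxt
    print("".join(out))  # locally plausible, globally meaningless text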
Ma Ru

Here come gravitational waves - 3 views

  •  
    Here you go. You can now scrap LISA altogether. Who's going to tell Pacome?
  •  
    Awesome and exciting stuff indeed! The data pinpoint the time when inflation occurred - about 10^-37 seconds into the Universe's life - and its temperature at the time, corresponding to energies of about 10^16 gigaelectronvolts, says cosmologist Michael Turner of the University of Chicago. That is the same energy at which three of the four fundamental forces of nature - the weak, strong and electromagnetic force - are expected to become indistinguishable from one another in a model known as the grand unified theory. I expect more fundamental physics insights to come out of this in the future. A full-sky survey from space may still be an interesting addition to the measurement capabilities, so I would not rule out LISA altogether, I guess...
Dario Izzo

IPCC models getting mushy | Financial Post - 2 views

  •  
    why am I not surprised .....
  •  
    http://www.academia.edu/4210419/Can_climate_models_explain_the_recent_stagnation_in_global_warming A view of well-respected scientists on how to proceed from here, which was rejected by Nature. In any case, a long way to go...
  •  
    unfortunately it's too early to cheer and burn more coal ... there is also a nice podcast associated with this paper from Nature:

    Recent global-warming hiatus tied to equatorial Pacific surface cooling
    Yu Kosaka & Shang-Ping Xie
    Nature 501, 403-407 (19 September 2013), doi:10.1038/nature12534
    Received 18 June 2013; Accepted 08 August 2013; Published online 28 August 2013

    Despite the continued increase in atmospheric greenhouse gas concentrations, the annual-mean global temperature has not risen in the twenty-first century, challenging the prevailing view that anthropogenic forcing causes climate warming. Various mechanisms have been proposed for this hiatus in global warming, but their relative importance has not been quantified, hampering observational estimates of climate sensitivity. Here we show that accounting for recent cooling in the eastern equatorial Pacific reconciles climate simulations and observations. We present a novel method of uncovering mechanisms for global temperature change by prescribing, in addition to radiative forcing, the observed history of sea surface temperature over the central to eastern tropical Pacific in a climate model. Although the surface temperature prescription is limited to only 8.2% of the global surface, our model reproduces the annual-mean global temperature remarkably well with correlation coefficient r = 0.97 for 1970-2012 (which includes the current hiatus and a period of accelerated global warming). Moreover, our simulation captures major seasonal and regional characteristics of the hiatus, including the intensified Walker circulation, the winter cooling in northwestern North America and the prolonged drought in the southern USA. Our results show that the current hiatus is part of natural climate variability, tied specifically to a La-Niña-like decadal cooling. Although similar decadal hiatus events may occur in the future, the multi-decadal warming trend is very likely to continue with greenhouse gas increase.
Tom Gheysens

Scientists discover double meaning in genetic code - 4 views

  •  
    Does this have implications for AI algorithms??
  • ...1 more comment...
  •  
    Somehow, the mere fact does not surprise me. I always assumed that genetic information is encoded on multiple overlapping layers. I do not see how this can be transferred exactly to genetic algorithms, but a good encoding is important for them, and I guess that you could produce interesting effects by "overencoding" of parameters, apart from being more space-efficient.
  •  
    I was actually thinking exactly about this question during my bike ride this morning. I am surprised that some codons would need to have a double meaning though, because there is already a surplus of codons relative to the 20-22 amino acids they translate into (depending on the organism). So there should be about 44 codons left to prevent translation errors and, in addition, to regulate gene expression. If - as the article suggests - a single codon can take a dual role, does it do so in different situations (needing some other regulator to discern those)? Or does it just perform two functions that always need to happen simultaneously? I tried to learn more from the underlying paper: https://www.sciencemag.org/content/342/6164/1367.full.pdf All I got from that was a headache. :-\
  •  
    Probably both. Likely a consequence of energy preservation during translation. If you can do the same thing with fewer genes, you save on the effort required to reproduce. Also I suspect it has something to do with modularity. It makes sense that the gene regulating "foot" cells also triggers the genes that generate "toe" cells, for example. No point in having an extra if statement.
tvinko

Massively collaborative mathematics : Article : Nature - 28 views

  •  
    peer-to-peer theorem-proving
  • ...14 more comments...
  •  
    Or: mathematicians catch up with open-source software developers :)
  •  
    "Similar open-source techniques could be applied in fields such as [...] computer science, where the raw materials are informational and can be freely shared online." ... or we could reach the point, unthinkable only few years ago, of being able to exchange text messages in almost real time! OMG, think of the possibilities! Seriously, does the author even browse the internet?
  •  
    I do not agree with you F., you are citing out of context! Sharing messages does not make a collaboration, nor does a forum ... You need a set of rules and a common objective. This is clearly observable in "some team", where these rules are lacking, making team work non-existent. The additional difficulties here are that it involves people who are almost strangers to each other, and the immateriality of the project. The support they are using (web, wiki) is only secondary. What they achieved is remarkable, disregarding the subject!
  •  
    I think we will just have to agree to disagree then :) Open source developers have been organizing themselves with emails since the early '90s, and most projects (e.g., the Linux kernel) still do not use anything else today. The Linux kernel mailing list gets around 400 messages per day, and they are managing to scale just fine as the number of contributors increases. I agree that what they achieved is remarkable, but it is more for "what" they achieved than "how". What they did does not remotely qualify as "massively" collaborative: again, many open source projects are managed collaboratively by thousands of people, and many of them are in the multi-million lines of code range. My personal opinion of why in the scientific world these open models are having so many difficulties is that the scientific community today is (globally, of course there are many exceptions) a closed, mostly conservative circle of people who are scared of change. There is also the fact that the barrier to entry into a scientific community is very high, but I think that this should merely scale down the number of people involved and not change the community "qualitatively". I do not think that many research activities are so much more difficult than, e.g., writing an O(1) scheduler for an Operating System or writing a new balancing tree algorithm for efficiently storing files on a filesystem. Then there is the whole issue of scientific publishing, which, in its current form, is nothing more than a racket. No wonder traditional journals are scared to death by these open-science movements.
  •  
    here we go ... nice controversy! but maybe too many things mixed up together - open science journals vs traditional journals, conservatism of science community wrt programmers (to me one of the reasons for this might be the average age of both groups, which is probably more than 10 years apart ...) and then using emailing wrt other collaboration tools .... .... will have to look at the paper now more carefully ... (I am surprised to see no comment from José or Marek here :-)
  •  
    My point about your initial comment is that it is simplistic to infer that emails imply collaborative work. You actually use the word "organize" - what does it mean, indeed? In the case of Linux, what makes the project work is the rules they set and the management style (hierarchy, meritocracy, review). Mailing is just a means of coordination. In collaborations and team work, it is about rules, not only about the technology you use to potentially collaborate. Otherwise, all projects would be successful, and we would not learn management at school! They did not write that they managed the collaboration exclusively because of wikipedia and emails (or other 2.0 technology)! You are missing the part that makes it successful and remarkable as a project. On his blog the guy put a list of 12 rules for this project. None are related to emails, wikipedia, forums ... because that would be lame, and then your comment would make sense. Following your argumentation, the tools would be sufficient for collaboration. In the ACT, we have plenty of tools, but no team work. QED
  •  
    the question of ACT team work is one that comes back continuously, and so far it has always boiled down to the question of how much there needs to be a team project to which everybody in the team contributes in his/her way, versus how much we should let smaller, flexible teams form and progress within the team, following a bottom-up initiative rather than imposing one from the top down. At this very moment, there are at least 4 to 5 teams with their own tools and mechanisms which are active and operating within the team. - but hey, if there is a real will for one larger project of the team to which all or most members want to contribute, let's go for it .... but in my view, it should be on a convince rather than oblige basis ...
  •  
    It is, though, indicative that some of the team members do not see all the collaboration and team work happening around them. We always leave the small and agile sub-teams to form and organize themselves spontaneously, but clearly this method leaves out some people (be it due to their own personal attitude or pure chance). For those cases we could think of providing the possibility to participate in an alternative, more structured team work, where we actually manage the hierarchy and meritocracy and perform the project review (to use Joris' words).
  •  
    I am, and was, involved in "collaboration", but I can say from experience that we are mostly a sum of individuals. In the end, it is always one or two individuals doing the job, and the others waiting. Sometimes even, some people don't do what they are supposed to do, so nothing happens ... this could not be defined as team work. Don't get me wrong, this is the dynamic of the team and I am OK with it ... in the end it is less work for me :) team = 3 members or more. I am personally not looking for a 15-member team work, and it is not what I meant. Anyway, this is not exactly the subject of the paper.
  •  
    My opinion about this is that a research team, like the ACT, is a group of _people_ and not only brains. What I mean is that people have feelings - hate, anger, envy, sympathy, love, etc. - about the others. Unfortunately(?), this can lead to situations where, in theory, a group of brains could work together, but not the same group of people. As far as I am concerned, this happened many times during my ACT period. And it is happening now with me in Delft, where I have the chance to be in an even more international group than the ACT. I collaborate efficiently with those people who are "close" to me not only in scientific interest, but also in some private sense. And I have people around me who have interesting topics and might need my help and knowledge, but somehow, it just does not work. Simply lack of sympathy. You know what I mean, don't you? About the article: there is nothing new, indeed. However, here is why it worked: only the brains, and not the people, worked together on a very specific problem. Plus maybe they were motivated by the idea of e-collaboration. No revolution.
  •  
    Joris, maybe I did not make myself clear enough, but my point was only tangentially related to the tools. Indeed, it was the original article's mention of "development of new online tools" which prompted my reply about emails. Let me try to say it more clearly: my point is that what they accomplished is nothing new methodologically (i.e., online collaboration of a loosely knit group of people); it is something that has been done countless times before. Do you think that now that it is mathematicians who are doing it makes it somehow special or different? Personally, I don't. You should come over to some mailing lists of mathematical open-source software (e.g., SAGE, Pari, ...), there's plenty of online collaborative research going on there :) I also disagree that, as you say, "in the case of Linux, what makes the project work is the rules they set and the management style (hierarchy, meritocracy, review)". First of all, I think the main engine of any collaboration like this is the objective, i.e., wanting to get something done. Rules emerge from self-organization later on, and they may be completely different from project to project, ranging from almost anarchy to BDFL (benevolent dictator for life) style. Given this kind of variety that can be observed in open-source projects today, I am very skeptical that any kind of management rule can be said to be universal (and I am pretty sure that the overwhelming majority of project organizers never went to any "management school"). Then there is the social aspect that Tamas mentions above. From my personal experience, communities that put technical merit above everything else tend to remain very small and generally become irrelevant. The ability to work and collaborate with others is the main asset that a participant in a community can bring. I've seen many times on the Linux kernel mailing list contributions deemed "technically superior" being disregarded and not considered for inclusion in the kernel because it was clear that
  •  
    hey, just caught up on the discussion. For me what is really new is mainly the framework in which this collaborative (open) work is applied. I haven't seen this kind of open working in any other field of academic research (except for BOINC-type projects, which are very different because they rely on non-specialists to do the work). This raises several problems, mainly that of credit, which has not really been solved as I read in the wiki (if an article is written, who writes it, and whose names go on the paper?). They chose to refer to the project, and not to the individual researchers, as a temporary solution... It is not so surprising to me that this type of work was first done in the domain of mathematics. Perhaps I have an idealised view of this community, but it seems that the result obtained is more important than who obtained it... In many areas of research this is not the case, and one reason is how the research is financed. To obtain money you need to have (scientific) credit, and to have credit you need to have papers with your name on them... so this model of research does not fit, in my opinion, with the way research is governed. Anyway, we had a discussion on the Ariadnet on how to use it, and one idea was to do this kind of collaborative research; an idea that was quickly abandoned...
  •  
    I don't really see much the problem with giving credit. It is not the first time a group of researchers collectively take credit for a result under a group umbrella, e.g., see Nicolas Bourbaki: http://en.wikipedia.org/wiki/Bourbaki Again, if the research process is completely transparent and publicly accessible there's no way to fake contributions or to give undue credit, and one could cite without problems a group paper in his/her CV, research grant application, etc.
  •  
    Well, my point was more that it could be a problem with how the current system works. Let's say you want a grant or a position; the jury will count the number of papers with you as first author, and the other papers (at least in France)... and look at the impact factor of the journals. Then you would have to set up a rule for classifying the authors (endless and pointless discussions), and give an impact factor to the group...?
  •  
    it seems that I should visit you guys at ESTEC... :-)
  •  
    urgently!! btw: we will have the ACT Christmas dinner on the 9th in the evening ... are you coming?
Dario Izzo

Optimal Control Problem in the CR3BP solved!!! - 7 views

  •  
    This guy solved a problem many people are trying to solve!!! The optimal control problem for the three-body problem (restricted, circular) can be solved using continuation of the secondary gravity parameter and some clever adaptation of the boundary conditions!! His presentation was an eye opener ... making the work of many pretty useless now :) (A toy sketch of the continuation idea is given at the end of this thread.)
  • ...13 more comments...
  •  
    Riemann hypothesis should be next... Which paper on the linked website is this exactly?
  •  
    hmmm, last year at the AIAA conference in Toronto I presented a continuation approach to design a DRO (three-body problem). Nothing new here unfortunately. I know the work of Caillau; although interesting, what is presented was solved 10 years ago by others. The interest of his work is not in the applications (CR3BP), but in the research of particular regularity conditions that unfortunately make the problem limited practically. Look also at the work of Mingotti, Russell, Topputo and others for the (C)RTBP. SMART-1 inspired a bunch of researchers :)
  •  
    Topputo and some of the other 'inspired' researchers you mention are actually here at the conference and they are all quite depressed :) Caillau really solves the problem: as a single-phase transfer, no tricks, no misconvergence, in general and using none of the usual cheats. What was produced so far by others were only local solutions valid for the particular case considered. In any case I will give him your paper, so that he knows he is working on already solved stuff :)
  •  
    Answer to Marek: the paper you may look at is: Discrete and differential homotopy in circular restricted three-body control
  •  
    Ah! With one single phase and a first-order method, then it is amazing (but it is still just the very particular CRTBP case). The trick, however, is the homotopy map he selected! Why this one? Any conjugate point? Did I misunderstand the title? I solved it in one phase with second-order methods for the less restrictive RTBP, or simply the 3-body problem... but as a strict answer to your title, the problem has been solved before. Note: in "Russell, R. P., 'Primer Vector Theory Applied to Global Low-Thrust Trade Studies,' JGCD, Vol. 30, No. 2", he does solve the RTBP with a first-order method in one phase.
  •  
    I think what is interesting is not what he solved, but how he solved the problem. But are the means more important than the end ... I dunno
  •  
    I also loved his method, and it looked to me like it is far more general than the CRTBP. As for the title of this post, OK, maybe it is an exaggeration, as it suggests that no solution was ever given before; on the other hand, as Marek would say, "come on guys!!!!!"
  •  
    The generality has to be checked. Don't you think his choice of mapping is too specific? He doesn't really demonstrate that it works better than others. In addition, the minimum-time choice makes the problem very regular (I guess you've experienced that solving min time is much easier than max mass, optimality-wise). There is still a long way to go before maximum mass + RTBP; Topputo et al. should be reassured :p Did you give him my paper? He may find it interesting since I mention the homotopy on mu, but for max mass :)
  •  
    Joris, that is the point I was excited about: at the conference HE DID present solutions to the maximum mass problem!! One phase, from LEO to an orbit around the Moon ... amazing :) You will find his presentation online ... (according to the organizers). I gave him the reference to your paper anyway, but no pdf, as you did not upload it on our web pages and I could not find it on the web. So I gave him some bibliography I had with me from the Russians, and from Russell, Petropoulos and Howell. As far as I know these are the only ones that can hope to compete with this guy!!
  •  
    for info only, my PhD, in one phase: http://pdf.aiaa.org/preview/CDReadyMAST08_1856/PV2008_7363.pdf I preferred Mars to the dead rock Moon though!
  •  
    If you send me the pdf I can give it to the guy .. the link you gave contains only the first page ... (I have no access till Monday to the AIAA thingy)
  •  
    this is why I like this Diigo thingy so much more than delicious ...
  •  
    What do you mean by this comment, Leopold? ;-) Joking aside: I am following the Diigo thingy with Google Reader (RSS). Obviously, I am getting the new postings. But if someone later adds a comment to a post, then I can miss it, because the RSS doesn't get updated. Not that it's a big problem, but do you guys have a better solution for this? How are you following these comments? (I know that if you have commented on an entry, then you get the later updates by email.) (For example, in Google Reader I can see only the first 5 comments in this entry.)
  •  
    I like when there are discussions evolving around entries
  •  
    and on your problem with the RSS, Tamas: it's the same for me, you get the comments only for entries that you have posted or that you have commented on ...
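
As mentioned at the top of this thread, a toy sketch of the continuation idea: solve a simple position-to-position transfer in the planar CR3BP by single shooting, warm-starting from the two-body case (mu = 0) and ramping mu up to the Earth-Moon value. This only illustrates the boundary-value mechanics, not Caillau's indirect optimal control solver; the endpoints, time of flight and first guess are invented, and convergence of such a naive setup is not guaranteed:

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import fsolve

    def crtbp(t, s, mu):
        """Planar CR3BP in the rotating frame, nondimensional units."""
        x, y, vx, vy = s
        r1 = np.sqrt((x + mu)**2 + y**2)      # distance to the primary
        r2 = np.sqrt((x - 1 + mu)**2 + y**2)  # distance to the secondary
        ax = x + 2*vy - (1 - mu)*(x + mu)/r1**3 - mu*(x - 1 + mu)/r2**3
        ay = y - 2*vx - (1 - mu)*y/r1**3 - mu*y/r2**3
        return [vx, vy, ax, ay]

    def miss(v0, r0, rT, T, mu):
        """Final-position mismatch for an initial-velocity guess v0."""
        sol = solve_ivp(crtbp, (0.0, T), [r0[0], r0[1], v0[0], v0[1]],
                        args=(mu,), rtol=1e-10, atol=1e-12)
        return sol.y[:2, -1] - rT

    r0, rT, T = np.array([0.2, 0.0]), np.array([0.8, 0.3]), 4.0  # invented
    v0 = np.array([0.0, 1.5])  # rough guess for the mu = 0 problem
    for mu in np.linspace(0.0, 0.0121505856, 30):   # ramp to Earth-Moon mu
        v0 = fsolve(miss, v0, args=(r0, rT, T, mu))  # warm start each step
    print("initial velocity at full mu:", v0)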
Joris _

Is It Time To Revamp Systems Engineering? | AVIATION WEEK - 1 views

  • They both believe the systems engineering processes that have served the aerospace and defense community since pre-Apollo days are no longer adequate for the large and complex systems industry is now developing.
  •  
    1) it has to actively work and produce the result you intended;
    2) the design must be robust;
    3) it should be efficient;
    4) it should minimize unintended consequences.
    "But we have to establish a formal, mathematically precise mechanism to measure complexity and adaptability . . . [where] adaptability means the system elements have sufficient margin, and can serve multiple purposes." "We need to break the paradigm of long cycles from design to product" Some interesting questions....
  • ...1 more comment...
  •  
    indeed ... already hotly debated in the CDF ... any suggestions in addition to what we have already contributed to this (e.g. system-level optimisation)?
  •  
    what is the outcome of the CDF study? I actually think that optimisation is not at all the key point. As stressed in this news item, it is robustness (points 2 and 4). This is something we should think about ...
  •  
    SYSTEM OF SYSTEMS, SYSTEM OF SYSTEMS!!! :-D
pacome delva

Neutron Star Formation Could Awaken the Vacuum | Physical Review Focus - 0 views

  • Lima and Vanzella joined with George Matsas of São Paulo State University in their latest work to examine a model of the highly curved spacetime that appears during the formation of an ultradense neutron star. For some reasonable values of the mass and size of the star, they predict that the vacuum energy will grow within milliseconds for some values of the coupling parameter. At this point the vacuum energy would begin to induce additional gravitational effects, which they haven't yet calculated, so they don't know how the star would be affected. If further research shows such a neutron star to be unstable, the existence of stable neutron stars of particular sizes could rule out the existence of fields of the type they modeled.
pacome delva

Beetle beauty captured in silicon - physicsworld.com - 1 views

  • Researchers in Canada have created a new material that mimics the brilliant iridescent colours seen in beetle shells. As the eye-catching effect can be switched off with the simple addition of water, the researchers believe their new material could lead to applications including "smart windows".
Dario Izzo

If you're going to do good science, release the computer code too!!! - 3 views

  • Les Hatton, an international expert in software testing resident in the Universities of Kent and Kingston, carried out an extensive analysis of several million lines of scientific code. He showed that the software had an unacceptably high level of detectable inconsistencies.
  •  
    haha, this guy won't make any new friends with this article! I kind of agree, but making your code public doesn't mean you are doing good science... and vice versa! He takes experimental physics as a counter-example, but even there, some teams keep their little secrets on the details of the experiment to keep a bit of a head start on other labs. Research is competitive in its current state, and I think only collaborations can overcome this fact.
  • ...1 more comment...
  •  
    well, sure, competitiveness is good, but for verification (and that should be the norm for scientific experiments) the code should be public; it would be nice to have something like BibTeX for code libraries and the versions used (see the sketch after this thread).... :) btw I fully agree that the code should go public, I had lots of trouble reproducing (reprogramming) some papers in the past ... grr
  •  
    My view is that the only proper way to do scientific communication is full transparency: methodologies, tests, codes, etc. Everything else should be unacceptable. This should hold both for publicly funded science (for which there is the additional moral requirement to give back to the public domain what was produced with taxpayers' money) and privately funded science (where the need to turn a profit should be of lesser importance than the proper application of the scientific method).
  •  
    Same battle we have been fighting for a few years now....
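
On the "BibTeX for code" wish above: biblatex already provides an @software entry type, so a pinned code version can be cited much like a paper. A plausible sketch - every identifier below is a placeholder, not a real package:

    @software{doe_mysolver_123,
      author  = {Doe, Jane},
      title   = {mysolver: a trajectory optimisation library},
      version = {1.2.3},
      date    = {2013-05-01},
      url     = {https://example.org/mysolver},
      note    = {Commit abc1234; all identifiers are placeholders},
    }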
Nina Nadine Ridder

Fla. airport gets OK for spaceport license - Space- msnbc.com - 0 views

  • Cecil Field becomes the country's eighth licensed commercial spaceport
  • In addition to suborbital passenger flights like those Virgin is offering, Cecil Field hopes to offer commercial orbital launch services
LeopoldS

Self-organized adaptation of a simple neural circuit enables complex robot behaviour : ... - 3 views

  •  
    is this really worth a Nature paper??
  •  
    Funny to read this question from you, of all people, the eternal fan of anything linked to bio :-) I have read worse papers in Nature, and in addition it's just "Nature Physics", viz. "Nature garbage." Could be that they don't find enough really good stuff to publish in all their topical clones of Nature.
  •  
    Francesco already posted this below
pacome delva

Mysterious 'dark flow' at the edge of the universe - 1 views

  • Cosmologists have already observed two distinct effects caused by invisible entities in the universe: dark matter is known to affect the rotation of galaxies and dark energy seems to be causing the expansion of the universe to accelerate. Dark flow is the latest addition to this shadowy family.
  •  
    I think Lucas didn't know he would have such an impact in science with Star Wars...!
  •  
    do you think it could be the dark side of The Force?
  •  
    what else...?
Nicholas Lan

artificial inorganic leaf (AIL) - 1 views

  •  
    bit confused about what they actually have achieved so far but sounds like it might turn out to be interesting. "The scientists first infiltrated the leaves of Anemone vitifolia -- a plant native to China -- with titanium dioxide in a two-step process. Using advanced spectroscopic techniques, the scientists were then able to confirm that the structural features in the leaf favorable for light harvesting were replicated in the new TiO2 structure. Excitingly, the AIL are eight times more active for hydrogen production than TiO2 that has not been "biotemplated" in that fashion. AILs also are more than three times as active as commercial photo-catalysts. Next, the scientists embedded nanoparticles of platinum into the leaf surface. Platinum, along with the nitrogen found naturally in the leaf, helps increase the activity of the artificial leaves by an additional factor of ten."