This is a bibliography management system one of my project partners suggested. I (Tobias) am trying it out in the frame of the project. If you do not hear anything from me, it was probably not so successful...
Albeit a bit dated, this is the classic Eric Raymond paper about the self-organizing open source model (the bazaar) compared to the usual closed software development model (the cathedral).
Is science today more a bazaar or a cathedral?
funny ....
this is exactly the book that Franco mentioned during one of the first meetings I had with him about the ACT, our research, how to organise, the potential of new ways of cooperating, etc ...
Well, nice, but I wonder how much cache per core will be available... With 48 cores a single memory bus becomes nothing more than one big (small? :) ) bottleneck.
Apparently they have separate L2 cache per tile (i.e., every two processors) and a high-speed bus connecting the tiles. As usual, whether it will be fast enough will depend on the specific applications (which BTW is also true for other current multi-core architectures).
The nice thing is of course that porting software to this architecture will be an order of magnitude less difficult than going to Tesla/Fermi/CELL architectures. This architecture will also be suitable for tasks other than floating-point computations (damn engineers polluting computer science :P) and it has the potential to be more future-proof than other solutions.
thanks - I was already wondering several times what had happened to this technique that he used at the talk we looked at several times when it was first uploaded ... good that they have made it open source! are they easy to use?
the easiest way to use them is:
Google Docs > open/create a spreadsheet > Insert > Gadget > Charts > Motion Chart !! :)
Here is a tutorial describing all the steps to get it running.
Big data analytic systems are reputed to be capable of finding a needle in a universe of haystacks without having to know what a needle looks like. One of the most widely used techniques for sorting large databases of unstructured text is Latent Dirichlet allocation (LDA). Unfortunately, LDA is also inaccurate enough at some tasks that the results of any topic model created with it are essentially meaningless, according to Luis Amaral, a physicist whose specialty is the mathematical analysis of complex systems and networks in the real world and one of the senior researchers on the multidisciplinary team from Northwestern University that wrote the paper.
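For context, LDA in practice means building a document-term count matrix and fitting a topic model over it. A minimal sketch with scikit-learn (the toy corpus and parameters are my own, purely illustrative):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# toy corpus (hypothetical); real topic models need far more text
docs = [
    "the cat sat on the mat with another cat",
    "dogs and cats are popular pets",
    "stock markets fell sharply on monday",
    "investors sold shares as markets dropped",
]

# bag-of-words counts: LDA models word counts, not raw strings
X = CountVectorizer(stop_words="english").fit_transform(docs)

# fit a 2-topic model; results depend on n_components and the random
# seed, which is part of the instability the article complains about
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

doc_topics = lda.transform(X)  # per-document topic mixture, rows sum to 1
```

Rerunning with a different seed can reshuffle or merge topics entirely, which is exactly the kind of irreproducibility the Northwestern team measured.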
Even for an easy case, big data analysis is proving to be far more complicated than many of the companies selling analysis software want people to believe.
Most of those companies are using outdated algorithms like this LDA and just applying them blindly to those huge datasets. Of course they're going to come out with bad solutions. No amount of data can make up for bad algorithms.
In Interstellar, the science-fiction film out this week, Matthew McConaughey stars as an astronaut contending with a supermassive black hole called Gargantua. The film's special effects have been hailed as the most realistic depiction ever made of this type of cosmic object. But astrophysicists have now gone one better - this is a really cool visualisation done by researchers at Cornell.
Several new components for biological circuits have been developed by researchers. These components are key building blocks for constructing precisely functioning and programmable bio-computers. "The ability to combine biological components at will in a modular, plug-and-play fashion means that we now approach the stage when the concept of programming as we know it from software engineering can be applied to biological computers."
TensorFlow is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well.
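To make the "data flow graph" idea concrete, here is a toy sketch (not TensorFlow's actual API): each node holds an operation, the edges are the arrays flowing between nodes, and evaluation walks the graph:

```python
import numpy as np

class Node:
    """A graph node: an operation plus the nodes feeding into it."""
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def eval(self):
        # edges "carry" the arrays: evaluate inputs, then apply the op
        return self.op(*(n.eval() for n in self.inputs))

def const(v):
    # a source node with no inputs, emitting a fixed tensor
    return Node(lambda: np.asarray(v))

a = const([[1.0, 2.0]])
b = const([[3.0], [4.0]])
c = Node(np.matmul, a, b)   # matmul node with two incoming edges
result = c.eval()           # 1*3 + 2*4 = 11
```

TensorFlow's real graphs add automatic differentiation and device placement on top, but the node/edge decomposition is the same idea.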
The EFF communiqué is actually quite inaccurate. This is disappointing coming from the EFF, though in part it is due to the communication from the researchers who "discovered" the attack.
PGP itself is not broken, but rather some implementations on some email clients (notably Enigmail, though it was patched several months ago). See https://protonmail.com/blog/pgp-vulnerability-efail/
On the other hand, if you are very keen on security, there is an XSS attack reported on Signal, so… https://thehackernews.com/2018/05/signal-messenger-code-injection.html
The *good* recommendation here is actually rather to keep your software stack up to date (surprising, no?) and keep encrypting your emails.
"Hinton's new approach, known as capsule networks, is a twist on neural networks intended to make machines better able to understand the world through images or video. In one of the papers posted last week, Hinton's capsule networks matched the accuracy of the best previous techniques on a standard test of how well software can learn to recognize handwritten digits."
Links to papers:
https://arxiv.org/abs/1710.09829
https://openreview.net/forum?id=HJWLfGWRb&noteId=HJWLfGWRb
Seems a very impressive guy: "Hinton formed his intuition that vision systems need such an inbuilt sense of geometry in 1979, when he was trying to figure out how humans use mental imagery. He first laid out a preliminary design for capsule networks in 2011. The fuller picture released last week was long anticipated by researchers in the field. 'Everyone has been waiting for it and looking for the next great leap from Geoff,' says Kyunghyun Cho, a professor"