Swift Playgrounds is a wonderful introduction to programming. It introduces imperative logic, functions, methods, loops, and many of the marvelous APIs that are available to iOS developers.
But it’s called a “playground” for a reason: you can’t make an app with Swift Playgrounds. You play with code, you learn about code, and you do indeed code. But if you want to build something useful and distributable, you need to look elsewhere.
Some popular options are Codea, which allows you to build full apps in Lua on your iPad; Pythonista, which offers a Python IDE and a number of popular libraries to work with; and of course there’s always the cloud.
But inside the Apple ecosystem, this “elsewhere” is called Xcode. It’s a huge and complicated application that runs only on Macs, and requires an Apple Developer account to effectively distribute the software you build.
"It's not my fault, my brain implant made me do it"
5 top code editors for programmers | Creative Bloq
Transcript of "Inventing on Principle" : Inside 245-5D
-
Doug Engelbart. Doug Engelbart basically invented interactive computing. The concept of putting information on a screen. Navigating through it. Looking at information in different ways. Pointing at things and manipulating them. He came up with all this at a time when real-time interaction with a computer was almost unheard of. Today he is best known as the inventor of the mouse, but what he really invented is this entirely new way of working with knowledge. His explicit goal from the beginning was to enable mankind to solve the world's urgent problems. He had this vision of what he called knowledge workers using complex, powerful information tools to harness their collective intelligence. And he only got into computers because he had a hunch that these new things called computers could help him realize that vision. Everything that he did was almost single-mindedly driven by pursuing this vision.
How to set up an iPad for web development - The Verge
-
A few months ago, I detailed my process for setting up a Pixelbook to code on. It wasn’t easy or simple or straightforward, but it worked in the end: I had all the power and flexibility of Linux, access to my favorite code editor (VS Code), and, of course, the slick web browsing experience that Chromebooks are known for. Around that same time, I attempted to set up an iPad for coding. I failed miserably. I love using the iPad for writing and other creative work. It’s super portable, it has great battery life and an excellent screen, and the limited multitasking keeps me focused. Unfortunately, it’s very bad for complex tasks and intentionally crippled for software development. But I’m older and wiser now, and after an entire Saturday spent bashing my head against a wall, I’m happy to report that I can use a $799 tablet computer to write software. Will I ever actually use it for this purpose? Maybe! But we’ll get to that.
The Coming Software Apocalypse - The Atlantic
-
It’s been said that software is “eating the world.” More and more, critical systems that were once controlled mechanically, or by people, are coming to depend on code. This was perhaps never clearer than in the summer of 2015, when on a single day, United Airlines grounded its fleet because of a problem with its departure-management system; trading was suspended on the New York Stock Exchange after an upgrade; the front page of The Wall Street Journal’s website crashed; and Seattle’s 911 system went down again, this time because a different router failed. The simultaneous failure of so many software systems smelled at first of a coordinated cyberattack. Almost more frightening was the realization, late in the day, that it was just a coincidence.
-
“When we had electromechanical systems, we used to be able to test them exhaustively,” says Nancy Leveson, a professor of aeronautics and astronautics at the Massachusetts Institute of Technology who has been studying software safety for 35 years. She became known for her report on the Therac-25, a radiation-therapy machine that killed six patients because of a software error. “We used to be able to think through all the things it could do, all the states it could get into.” The electromechanical interlockings that controlled train movements at railroad crossings, for instance, only had so many configurations; a few sheets of paper could describe the whole system, and you could run physical trains against each configuration to see how it would behave. Once you’d built and tested it, you knew exactly what you were dealing with.
-
Software is different. Just by editing the text in a file somewhere, the same hunk of silicon can become an autopilot or an inventory-control system. This flexibility is software’s miracle, and its curse. Because it can be changed cheaply, software is constantly changed; and because it’s unmoored from anything physical—a program that is a thousand times more complex than another takes up the same actual space—it tends to grow without bound. “The problem,” Leveson wrote in a book, “is that we are attempting to build systems that are beyond our ability to intellectually manage.”
In Search of God's Mathematical Perfect Proofs | WIRED
-
Paul Erdős, the famously eccentric, peripatetic and prolific 20th-century mathematician, was fond of the idea that God has a celestial volume containing the perfect proof of every mathematical theorem. “This one is from The Book,” he would declare when he wanted to bestow his highest praise on a beautiful proof.
The Hilarious (and Terrifying?) Ways Algorithms Have Outsmarted Their Creators
-
Flying saucers have yet to land—at least, not that we've confirmed—but alien intelligence is already here. As research into AI grows ever more ambitious and complex, these robot brains will challenge the fundamental assumptions of how we humans do things. And, as ever, the only true law of robotics is that computers will always do literally, exactly what you tell them to.
Developers love trendy new languages, but earn more with functional programming | Ars T...
-
Developer Q&A site Stack Overflow performs an annual survey to find out more about the programmer community, and the latest set of results has just been published. JavaScript remains the most widely used programming language among professional developers, making that six years at the top for the lingua franca of Web development. Other Web technologies also ranked highly: HTML (#2), CSS (#3), and PHP (#9). Business-oriented languages were also in wide use, with SQL at #4, Java at #5, and C# at #8. Shell scripting made a surprising showing at #6 (having not shown up at all in past years, which suggests that the questions have changed year to year), Python appeared at #7, and systems programming stalwart C++ rounded out the top 10.

These aren't, however, the languages that developers necessarily want to use. Only three languages from the most-used top ten were in the most-loved list: Python (#3), JavaScript (#7), and C# (#8). For the third year running, that list was topped by Rust, the new systems programming language developed by Mozilla. Second on the list was Kotlin, which wasn't even in the top 20 last year; this new interest is likely due to Google's decision last year to bless the language as an official development language for Android. TypeScript, Microsoft's "better JavaScript than JavaScript," comes in fourth, with Google's Go language coming in fifth. Smalltalk, last year's second-most loved, is nowhere to be seen this time around.

These languages may be well liked, but it looks as if the big money is elsewhere. Globally, F# and OCaml are the top average earners, and in the US, Erlang, Scala, and OCaml are the ones to aim for. Visual Basic 6, Cobol, and CoffeeScript were the top three most dreaded, which is news that will surprise nobody who is still maintaining Visual Basic 6 applications years after they were originally written. Stack Overflow also asked devs about one of today's hot-button issues: artificial intelligence.
Only 20 percent of devs were worried about AI taking jobs (compared to 41 percent who were excited by that possibility; no doubt the Visual Basic 6 devs hope that one day computers will be able to do their jobs for them), but a remarkable 28 percent were concerned about AI surpassing human intelligence, and 29 percent were concerned about algorithms making important decisions more generally. Among developers who actually know what they're talking about, however, the concerns seemed to shift: data scientists and machine-learning specialists were 1.5 times more likely to be concerned about the algorithmic fairness of AI systems than about any singularity.

Even if AI is evil, most developers don't think it's the fault of the programmers. Fifty-eight percent say that ethics are the responsibility of upper management, 23 percent say the inventor of the unethical idea, and just 20 percent think they're the responsibility of the developer who actually wrote the code. If the Volkswagen emissions scandal is anything to judge by, the developers may not be completely off the mark; thus far, arrests appear to have been restricted to the executives and engineers who designed the emissions-test-defeating software, leaving the people who wrote the code unscathed.
Apple's Swift Programming Language Is Now Top Tier | WIRED
-
Apple's programming language Swift is less than four years old, but a new report finds that it's already as popular as its predecessor, Apple's more established Objective-C language. Swift is now tied with Objective-C at number 10 in the rankings conducted by analyst firm RedMonk. It's hardly a surprise that programmers are interested in Apple's language, which can be used to build applications for the iPhone, Apple Watch, Macintosh computers, and even web applications. But the speed at which it jumped in the ranks is astonishing. Swift is the fastest-growing language RedMonk has seen since it started compiling these rankings in 2011. Even Go, a programming language that Google released in 2009, hasn't been able to break into the top 10.

The second-fastest grower is Kotlin, which Google now officially supports on Android. It leaped from number 46 in the third quarter of 2017 to number 27 in January.

RedMonk's rankings don't necessarily reflect whether companies are using these languages for real-world projects, or how many jobs are available for developers who know them. Instead, the firm tries to gauge how interested programmers are in these languages. Popularity among programmers could influence business decisions such as what languages to use for new projects.

RedMonk compiles its rankings by looking at the number of questions people ask about each language on the question-and-answer site Stack Overflow, as well as the number of projects using particular languages on the code hosting and collaboration site GitHub. The methodology was originally created by data scientists Drew Conway and John Myles White in 2010.

Apple first released Swift in 2014. The idea was not just to make it easier for new developers to learn to program, but to simplify life for experienced coders as well.
Many languages over the years have aimed to smooth the programming process by offering syntax that's easier to read or building in features that programmers otherwise commonly write from scratch. But these sorts of languages often produced applications that ran more slowly than ones written in more difficult programming languages. Swift aimed to combine programmer-friendly features with performance.

Kotlin, which was created by the company JetBrains and officially released in 2016, has similar goals. What sets Kotlin apart is that it's compatible with the widely used Java programming language, which means programmers can include Java code in their Kotlin programs, or even write new features for Java applications using Kotlin. Kotlin had already received widespread attention from Java developers, but once Google announced full support for the language on Android, interest skyrocketed. RedMonk analyst Stephen O'Grady pointed out in the report that Kotlin’s Java roots could help it find its way into more places than Swift, such as large enterprise applications.

Apart from the big gains for Swift and Kotlin, the RedMonk rankings changed fairly little this quarter. JavaScript and Java remained the two most popular languages, closely followed by Python, PHP, and C#. As O'Grady notes in the report, it’s becoming harder and harder for new languages to break into the top 20. That makes the rise of Swift and Kotlin all the more impressive.
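The RedMonk methodology described in this excerpt fuses two popularity signals: Stack Overflow question counts and GitHub project counts. As a rough illustration only (this is not RedMonk's actual algorithm, and the language counts below are invented for the example), a combined ranking can be sketched by ranking each language on each signal and averaging the two ranks:

```python
# Toy sketch of a two-signal language ranking.
# All counts are invented for illustration; they are not real data.

stack_overflow_questions = {"JavaScript": 1800, "Java": 1500, "Swift": 600, "Kotlin": 300}
github_projects = {"JavaScript": 2200, "Java": 1400, "Swift": 700, "Kotlin": 450}

def ranks(counts):
    """Map each language to its 1-based rank (highest count = rank 1)."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {lang: i + 1 for i, lang in enumerate(ordered)}

def combined_ranking(signal_a, signal_b):
    """Order languages by the mean of their ranks on the two signals."""
    ra, rb = ranks(signal_a), ranks(signal_b)
    return sorted(ra, key=lambda lang: (ra[lang] + rb[lang]) / 2)

print(combined_ranking(stack_overflow_questions, github_projects))
# → ['JavaScript', 'Java', 'Swift', 'Kotlin']
```

The real rankings correlate the two signals against each other rather than simply averaging them; the sketch only conveys the basic idea of combining interest metrics from two independent sources.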
Ubisoft's AI in Far Cry 5 and Watch Dogs could change gaming | WIRED UK
-
AI has a new task: helping to keep the bugs out of video games.

At the recent Ubisoft Developer Conference in Montreal, the French gaming company unveiled a new AI assistant for its developers. Dubbed Commit Assistant, the goal of the AI system is to catch bugs before they're ever committed into code, saving developers time and reducing the number of flaws that make it into a game before release. "I think like many good ideas, it's like 'how come we didn't think about that before?'," says Yves Jacquier, who heads up La Forge, Ubisoft's R&D division in Montreal. His department partners with local universities, including McGill and Concordia, on research intended to advance the field of artificial intelligence as a whole, not just within the games industry.

La Forge fed Commit Assistant with roughly ten years' worth of code from across Ubisoft's software library, allowing it to learn where mistakes have historically been made, reference any corrections that were applied, and predict when a coder may be about to write a similar bug. "It's all about comparing the lines of code we've created in the past, the bugs that were created in them, and the bugs that were corrected, and finding a way to make links [between them] to provide us with a super-AI for programmers," explains Jacquier.
The tyranny of algorithms is part of our lives: soon they could rate everything we do |...
-
For the past couple of years a big story about the future of China has been the focus of both fascination and horror. It is all about what the authorities in Beijing call “social credit”, and the kind of surveillance that is now within governments’ grasp. The official rhetoric is poetic. According to the documents, what is being developed will “allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step”. As China moves into the newly solidified President Xi Jinping era, the basic plan is intended to be in place by 2020. Some of it will apply to businesses and officials, so as to address corruption and tackle such high-profile issues as poor food hygiene. But other elements will be focused on ordinary individuals, so that transgressions such as dodging transport fares and not caring sufficiently for your parents will mean penalties, while living the life of a good citizen will bring benefits and opportunities. Online behaviour will inevitably be a big part of what is monitored, and algorithms will be key to everything, though there remain doubts about whether something so ambitious will ever come to full fruition. One of the scheme’s basic aims is to use a vast amount of data to create individual ratings, which will decide people’s access – or lack of it – to everything from travel to jobs. The Chinese notion of credit – or xinyong – has a cultural meaning that relates to moral ideas of honesty and trust. There are up to 30 local social credit pilots run by local authorities, in huge cities such as Shanghai and Hangzhou and much smaller towns. Meanwhile, eight ostensibly private companies have been trialling a different set of rating systems, which seem to chime with the government’s controlling objectives.
Here are the best programming languages to learn in 2018
China Deploys "SkyNet" Facial Recognition, Can Compare 3 Billion Faces Per Second