I don't git it, either...
Git is the source code version control system that is rapidly becoming the standard for open source projects. It has a powerful distributed model which allows advanced users to do tricky things with branches, and rewriting history. What a pity that it's so hard to learn, has such an unpleasant command line interface, and treats its users with such utter contempt.
Linux moves closer to ARM. Microsoft is already there. Apple's been there, done that. How long can Intel hold on?
Linus Torvalds has officially announced that version 3.7 of the Linux kernel has gone stable, and that means good news for developers who work with ARM-based CPUs: among its other changes, Linux 3.7 is the first Linux kernel to include generic support for multiple ARM CPU architectures, reducing the amount of effort required to get Linux-based operating systems running on phones, tablets, and ARM-licensed developer boards like the Raspberry Pi.
Really interesting history, proving that design tradeoffs, compromises and shortcuts can still lead to great products.
It is arguable that ARM and Intel, the two companies locked in head-to-head processor competition, represent two different poles and philosophies.
The Science Guy lays it down - I gotta find the video this references!
Bill Nye may still be The Science Guy, but he's no longer Mr. Nice Guy.
Update: Damn, I wish this was true.
Interesting questions! When I was a kid I had a model rocket that had a still camera as its payload - it snapped a picture when the parachute deployed. Was that spying?
Better yet - what if one of this fellow's neighbors shoots down his UAV over the neighbor's property? Who's liable for what?
Now does the illustration make sense?
My poor kitten, who my unfortunate Instagram contacts know too well, gets beat up every time he goes outside. There's a bully cat in the neighborhood who appears to relish attacking cute, fluffy things as soon as they get out of human oversight. So, naturally, I bought a Parrot AR.Drone 2.0, a remote-controlled quadcopter with an HD camera attached, to see if I could spot where the punk bully cat hangs out.
Fascinating discussion of the history and future of artificial intelligence research.
The lack of progress in AGI is due to a severe logjam of misconceptions. Without Popperian epistemology, one cannot even begin to guess what detailed functionality must be achieved to make an AGI. And Popperian epistemology is not widely known, let alone understood well enough to be applied. Thinking of an AGI as a machine for translating experiences, rewards and punishments into ideas (or worse, just into behaviours) is like trying to cure infectious diseases by balancing bodily humours: futile because it is rooted in an archaic and wildly mistaken world view.
For the terminally curious, in detail.
Web browsers are probably the most widely used software. In this book I will explain how they work behind the scenes. We will see what happens when you type 'google.com' in the address bar until you see the Google page on the browser screen.
This perfectly elucidates my main objection to e-books - enabling a digital dark age.
An e-book is not a physical book. That point might seem trite until you stop for a moment to think how much simpler it is, in a certain sense, to destroy electronic than physical traces. There's no need of inciting mass cooperation in book-burning enterprises. No need for secret police or raids or extensive surveillance. The power to remove a book from a device, to remove all traces of it from retailers' websites, to expunge it from a publisher's online record: It would simplify the work of a would-be Soviet Union or Oceania multifold, would it not? It's ugly. For all kinds of reasons.
While this concentrates on medical research, the problem is much more widespread than that - remember cold fusion?
More than half of biomedical findings cannot be reproduced – we urgently need a way to ensure that discoveries are properly checked.
Some good advice in here - most of which I follow ...
As a software engineer, you might want any number of things out of your job - a steady paycheck, the opportunity to work on interesting projects, a springboard to the next better job, or maybe you just like hanging out with other programmers. But by "effective", I mean the ability to complete projects in a timely manner with the expected quality. After working on dozens of software releases, I believe the following practices will bring you there, and while they may involve sticking your neck out, I'd like to think they will also advance your professional reputation, career longevity, and personal satisfaction.
In short, stop trying to quantify the unquantifiable. Sound advice for any endeavor.
Every softer discipline these days seems to feel inadequate unless it becomes harder, more quantifiable, more scientific, more precise. That, it seems, would confer some sort of missing legitimacy in our computerized, digitized, number-happy world. But does it really? Or is it actually undermining the very heart of each discipline that falls into the trap of data, numbers, statistics, and charts? Because here's the truth: most of these disciplines aren't quantifiable, scientific, or precise. They are messy and complicated. And when you try to straighten out the tangle, you may find that you lose far more than you gain.
Absolutely excellent bit on the real art of programming (or engineering, or life in general) - reducing complexity. It's a goal I strive for everywhere, from the breeds we choose for our flock to the design of water systems in the barn to my choice of programming languages and tools. Simple is invariably better, and it's nice to see this truth explained in such a clear and cogent manner.
I would venture to say that most software developers have some sort of belief that they are just a regular programmer, but there exist out there some super programmers who actually do the difficult algorithms that control caches on hard drives and index search results for Google.
Network and endpoint security may not strike you as the first place to scratch an experimental itch. After all, protecting the company's systems and data should call into question any action that may introduce risk. But IT security threats constantly evolve, and sometimes you have to think outside the box to keep ahead of the more ingenious evildoers.
The full title of the piece is "Steve Wozniak: Cloud Computing Will Cause 'Horrible Problems In The Next Five Years'", and when I ran across it today I muttered my agreement. I made a pretty good living in the late '80s and early '90s migrating data from remote mainframes to local computers and networks, and see "the cloud" as just another incarnation of Big Iron. I got quite a kick out of some of the comments on the piece, accusing the Woz of being Old School, and a borderline Luddite.
Lo and behold, tonight I find confirmation of the Apple wizard's worst fears - not five years hence, but today: How Apple and Amazon Security Flaws Led to My Epic Hacking. Read it and be afraid - very afraid.
Steve Wozniak really, really doesn't like the cloud.
Fine bit of writing on what makes a good programmer.
The most frequently viewed page on this site is Signs you're a bad programmer, which has also now been published on dead trees by Hacker Monthly, and I think that behoves me to write its antithesis. "Bad programmer" is also considered inflammatory by some who think I'm speaking down to them. Not so; it was personal catharsis from an author who exhibited many of those problems himself. And what I think made the article popular was the "remedies"--I didn't want someone to get depressed when they recognized themselves, I wanted to be constructive.