Oh crap, that’s scary.
An unidentified drone came close to hitting a plane as it landed at Heathrow, the Civil Aviation Authority (CAA) has confirmed.
An Airbus A320 pilot reported seeing a helicopter-style drone as the jet was 700 feet off the ground on its approach to the runway at 1416 GMT on 22 July.
The CAA has not identified the airline or how close the drone came to the plane, which can carry 180 people.
It gave the incident an “A” rating, meaning a “serious risk of collision”.
This is the highest incident rating the CAA can give.
Now add this part:
The incidents have prompted a warning from the British Airline Pilots’ Association (Balpa) that the rapid increase in the number of drones operated by amateur enthusiasts now poses “a real risk” to commercial aircraft.
And now this part:
Sales of drones have increased rapidly, with UK sales running at a rate of between 1,000 and 2,000 every month.
They are expected to be very popular as Christmas presents.
They cost as little as £35 for a smaller model…
Jesus fucking christ.
Holms says
Reminds me of the threat posed by laser strikes from those little keychain-laser thingoes. How people can be so cavalier with hundreds of people in the air (and more below them if the plane crashes) is beyond me.
Lofty says
Drones are the genie that won’t let itself be pushed back into its bottle.
sonofrojblake says
The sky will be thick with these things within a decade. They are simply too useful and cheap. There’s no way to ban them because they are ridiculously easy to build, and they literally fly themselves – no skill is required. You don’t even necessarily need a radio control – they can be programmed to follow a predetermined flightpath by GPS, no external control required.
I fly paragliders cross country. The advent of pilotless aircraft that can’t see me but could kill me is an imminent and direct danger to me, but that danger is trivial compared to the potential mischief bad actors could wreak with them. I’m only surprised worse hasn’t already happened.
latsot says
I can see incidents like this being used as an excuse by authorities to force a backdoor into drones to track where they go and who sent them there.
sonofrojblake says
You don’t understand – such a backdoor would be effective only for off-the-shelf toys. It is already trivially easy and cheap to buy generic electronics for these things piecemeal and build one from scratch with capabilities that would terrify you. For a total cost of less than $2,000 I could go out today and buy the parts to build a drone that could carry a hand grenade through any open window you care to nominate. I offer you the choice of either entering the GPS coordinates you want it to fly to and letting it get there itself, or, if you don’t trust that, guiding it yourself using an onboard video camera. I say “guide” rather than “pilot” or “fly”, because the thing pilots itself – keeps itself stable and on-station automatically. You have as much control over it as you do over a taxicab – it’ll do anything you say, and the complex business of actually doing it is taken out of your hands.
Any generic chipset that was known to have tracking backdoors would soon be superseded in the market by an alternative that lacked such a feature. Furthermore, such a backdoor would only be any use after the event assuming the chipset could be recovered intact enough to be readable and the purchaser was honest about their identity.
If you’re not scared by what civilian drones will be used for over the next decade, the only reason is that you don’t know enough.
latsot says
@sonofrojblake
I understand perfectly.
But none of that would prevent governments from spying on off-the-shelf models, even though it wouldn’t be effective at catching baddies. That lack of effectiveness hasn’t stopped the UK government from spying on everyone’s emails and phone calls, for example, or from passing laws to make that legal (and extendible whenever they feel like it).
I didn’t say I wasn’t scared about what civilian drones will do over the next decade and your assumptions about my knowledge on these matters are incorrect. But I’m also scared about what governments will do in the name of fear – warranted or otherwise – to compromise everyone’s privacy, especially since – as you point out – a measure like this won’t be effective against most threats.
This is a systematic effort by the UK, US and other governments to erode our privacy without a transparent threat analysis. I’m perfectly aware that just about anyone could build a drone to fly a bomb through a window or weapons over security fences, which could make a lot of traditional security barriers obsolete. But we won’t know how much we should be scared about particular threats because we haven’t seen a threat analysis from the people who ought to understand those threats the best. This means we don’t know if giving up certain privacies and freedoms in order to combat these threats is a good bargain.
I’m not against a population giving up freedoms to protect itself, but I’m very much against it being hoodwinked into that decision by a government that lies (directly or by omission) about the nature of the threats.
sonofrojblake says
If I bought, say, a DJI Phantom, I’d be fine with the police being able, with a warrant, to interrogate some software log on it to see if I’d flown it somewhere I shouldn’t. Flying a drone isn’t a fundamental human right or anything. If all off-the-shelf drones were required to be trackable, that requirement would do far more to protect me than it would to harm me. It would not be effective at catching “baddies”, but it would be better than nothing at catching dilettante irresponsible arseholes messing about in controlled airspace. For every nutbar intent on bringing down an airliner for Allah, there are probably a thousand redneck morons who might do so by accident while just hooning about near an airport. The first guy we must needs leave to the spooks, while I’m keen to give the normal police whatever powers they need to deal with the latter, because they’re the real, common, everyday threat. As these things get cheaper, they’re only going to get more common.
For comparison, if all CARS were required to be trackable, that would be harder to argue. But I’ll bet you anything you like that when, in ten or twenty years’ time, self-driving cars become commonplace, tracking of them will be mandatory. And I’ll be fine with that, too, for the same reasons.
latsot says
Sorry for the long comment, doubly so if it’s inappropriate to have this conversation here.
TL;DR: giving up freedoms is dangerous but often necessary; let’s make sure we’re doing it right. Also, I don’t know the answers in the drone, car, or car-drone examples.
@sonofrojblake
That’s fine; it’s a decision everyone is capable of making for themselves, provided they understand the consequences – both current and future. The problem with authorities having access to this kind of data is that it’s a very slippery slope indeed, as we’ve seen many times. Once they have some data under certain conditions, they *never* fail to push for more under looser conditions, and they *never* fail to use fear – misinformed or otherwise – as a tool to get their way.
I’m not saying that drones shouldn’t be trackable. I’m saying that we need to be aware of what we’re giving up when we legislate about it.
Isn’t it? I’m not sure why not. It might be argued that it’s a fundamental but restricted human right. Is bouncing a ball or playing with Lego a fundamental human right? But this is a digression. I’m not arguing for the right to fly drones. I’m arguing for the right to know what freedoms we have; to know which ones we’re giving up when legislation happens (including what – if any – restrictions there are on how the legislation could change in the future); to know whether the freedoms we give up are actually likely to reduce the threats we’re told they’ll reduce; and then, later, whether or not they actually did. I think societies also need more carefully defined mechanisms for taking certain privacies back if they have proved ineffective or the cost to privacy has proved too great.
With respect, you don’t know that. Nobody does. Nobody knows it because we don’t know how that data will be used in the future.
I don’t disagree. I just don’t know whether that threat is worth the privacy loss. Doing a proper threat analysis with all the data available would help us understand this better. Would tracking be an effective countermeasure? Would it cause additional security problems? What privacy problems could it cause? We don’t know any of those things, so we shouldn’t agree to the countermeasure offhand, even if it seems at face value like a no-brainer.
In the first case, we have both argued that tracking wouldn’t be an effective countermeasure, for the reasons you stated in your first post. Of course it’s the spooks’ job to deal with it, but their powers should certainly be limited. I can see that there are reasons to keep secret from the public certain things about the information collected, stored and processed about them. But there are nevertheless privacy bargains that should not be made in the name of safety.
In the second case, I had to raise my eyebrows at the idea of “giving the normal police everything they need to…” I have some caveats. If that means “everything they *actually* need”, rather than what they *say* they need, then OK in principle. But only under the right conditions. Needing a warrant or other transparent and well-understood procedure to get a person’s data is a reasonable requirement, but in these days of big data it is becoming less of a protection. I’ll explain why in a moment.
I agree that governments will strongly argue for that. I think they’ll be pushing for it long before self-driving cars become common. I agree that most governments will get their way and our cars will end up being tracked, one way or another. I also understand why lots of people would be fine with that, and I respect their right to agree. But I’d still campaign against it, and for better transparency and control by the public over what happens to our data.
Here are some problems:
* We don’t know what sort of mission creep will occur, but we can be virtually certain that it will occur. The temptation is too great, and history has shown it to be more or less inevitable. When we legislate to remove a freedom, it’s like a ratchet: very difficult to get the freedom back (we have to disassemble the mechanism) but very, very easy to give up more. With a click. For example, if we legislate to collect everyone’s phone and email metadata, accessible only via a well-understood process given certain criteria, it wouldn’t be difficult to relax those criteria during ‘emergencies’. And then easier still to relax the definition of ‘emergency’. It wouldn’t take much to reach a situation where our data is being mined, rather than accessed only when absolutely necessary. Mining is bad, especially in conjunction with the next problem:
* We don’t know how data may be combined with other data. One of the perils of “big data” (although I despise that term) is that we have lots of little islands of data that are perhaps innocuous on their own, but might reveal more about us than we’d like when joined together. It’s becoming easier every day to bridge those islands. Suppose data from our cars is linked with our comms data and our social media. This sort of thing can be used to generate suspicion, which is to say to relax the criteria for investigating someone. If my car records showed I (or at least my car) was in the rough vicinity of a terrorist attack (and ‘vicinity’ could and would be quite arbitrarily defined) and I’d commented on a tweet in sympathy with a cause related to those terrorists’, would I automatically become a suspect even if there was nothing else linking me to the crime? It might be argued that it’d be OK if I did, because it’s better to be safe than sorry. If I did, that would presumably enable the police (and spooks) to investigate more of my data. What if that implicated me in another crime? You can always dig up something about anyone that at least looks bad, partly because:
* When we are victims of mass surveillance, we have lost control of our own narrative. I recently read an article about how people are posting embarrassing photographs of themselves on social media so they can control the caption. They’d rather say “OMG, look at this bad picture of me” than have someone else post the picture and say “look at this drunken idiot, LOL”. Think about pictures of celebrities taken when they’re asleep on a plane, or coming out of a nightclub into the dark and having a flash go off unexpectedly in their face, and tabloid headlines saying they look stoned. The article also mentioned a Foursquare post that the user meant one way but that a reader could interpret in another, sinister way. We need to be in control of the narratives that can be put together from our islands of data because:
* Everyone has something to hide. That might be because we’ve broken the law, because we’ve done something we’re not proud of or because we’ve done or we are something we feel is just nobody else’s business. And let’s not forget:
* None of this information is safe. No security system is perfect. Mass surveillance means that the data that can be leaked about us by illicit means is much more damaging than ever before. There’s a whole new set of threats to worry about if someone other than government gets hold of this data. And let’s not forget that governments have a history of leaking data on individuals when it suits them.
OK, so looooong comment, and yet not long enough to stop me sounding like a conspiracy theorist or to explain these issues properly. I don’t think that tracking drone data will automatically lead to the ruination of society, but it will be another tool in our governments’ quest for mass surveillance of their citizens.
I’d like people to be able to fly their drones or drive their cars or conduct their daily activities in ways that don’t hurt people without the possibility that every innocent person’s privacy will be compromised.
moarscienceplz says
I was into model rockets when I was a kid. One of the rules the rocket companies constantly impressed on us was to NEVER operate anywhere near where airplanes flew. I don’t believe I have ever heard of a complaint of rockets interfering with aircraft. What the heck is wrong with these drone operators?
F [i'm not here, i'm gone] says
moarscienceplz
The problem with the operators is ubiquity: low barrier to entry, less work and geekery involved, sustained controlled flight, and easy retrievability. Plus all the stuff drones can pack (like cameras) makes them far more amusing for the average twit. You just have a much larger population (therefore more twits) buying drones or having access to them, and they can play with them for extended periods of time.
By the time we have assholes hovering them outside our windows and such on a regular basis, I’m hoping the glue-foam gun is as easily obtainable and usable (with a very low unintended consequence factor) for dealing with these.
Pen says
That doesn’t make sense. Bombs are easy to build but you can still ban them.
sonofrojblake says
Read what you quoted. I didn’t say “easy”, I said “ridiculously easy”. As in, requires next to no knowledge of aerodynamics, electronics, radio control, GPS, programming or much of anything. Requires no hard-to-acquire materials. Requires no difficult experimentation. Poses absolutely no risk. This last is the kicker. Sure, building a bomb is easy. Building a bomb that actually works and kills other people, and not you, isn’t. Time and again when I was growing up there’d be news stories about Irish people who were killed by bombs they’d been intending to use to kill English people. Oh how we laughed.
Also, there’s no legitimate use for a bomb. There are literally hundreds if not thousands of legit uses for a drone. Being caught with one is zero-risk, right now, and likely always.
Analogy fail.
latsot says
Nothing. You don’t feel comfortab… oh wait a moment, have we heard that someplace before? What complaints do you have? Who do you imagine you can address those complaints to?