Thursday, April 5, 2018

A self-driving conundrum.

The other day I was coding a web page using the framework Vue. You don't have to know what Vue is to understand this next part; the only thing you need to know is that with Vue you program some things differently. I was making a button that simply changes the page, and it was giving me trouble, so I coded this:

<a class="d-block mb-4 h-100" href="#" v-on:click="change_page"></a>
Thing is, I tried and I tried and I just couldn't get it to work. That's when my friend told me to delete the href="#", because I didn't need it there. It was less than a line of code, yet it was the source of all my problems (the working version is below). The point I'm trying to make is that we all make mistakes, and small things like that are easy to overlook. Maybe this time it was only a button on a web page, but other times it can be something more critical, and mistakes like that can literally be life-threatening. Take for example the launch of NASA's Mariner 1 rocket: it had to be destroyed mid-flight because the person transcribing the guidance program into the computer missed a hyphen. That time there were no astronauts on board, but in other cases simple errors, poor coding, or just bad practices can result in deadly situations; take for example the Therac-25, where bad practices by programmers caused the deaths of multiple hospital patients.
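Going back to that button for a second, here is roughly what the working markup ended up looking like once the href was gone (same click handler as in the snippet above):

<a class="d-block mb-4 h-100" v-on:click="change_page"></a>

Nothing else changed; the click just calls change_page and that's it.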

I write all this to bring an interesting topic to the table: we programmers have far more responsibility in our hands than we may think. Recently, a self-driving car hit and killed a pedestrian, the first time a pedestrian has died in an accident with a self-driving car. Even though it is impressive that the technology has caused so few casualties, it is still a tragedy. More information has since been revealed about the incident. People like to say that this scary new tech is to blame, but footage of the accident (which I'm not going to link out of respect for the victim) shows the victim jaywalking across a dark road, so there is a chance that the car's sensors simply couldn't react fast enough.

But what if that wasn't the case? This is where the conundrum part of the title comes into play: could it be that the vehicle's sensors did detect the approaching pedestrian? Computers are better at seeing things, after all; that's what they were built to do. Someone has to take the blame here; there is always someone at fault in a traffic accident, if not everyone involved. However, if the sensors did pick up the person crossing, the mistake wouldn't have been made in a split second but months earlier, whenever the code for the car was written. It would be another example of a deadly error, something overlooked, and in that case the tech company is to blame.

Self-driving automobiles are a source of both excitement and fear. They could be the solution to all our traffic problems and they offer great benefits, but many people get anxious about the subject, and I hear them; cases like the one discussed above can be very worrisome. And then we have the ethical dilemmas. The following TED-Ed video describes a hypothetical situation that, even if it is just a thought experiment, is important to take into consideration when we are building the tools of the future.

(The ethical dilemma of self-driving cars - Patrick Lin)

We have to think about all these cases and their outcomes when building the tools of the future; we cannot be thorough enough, and as far as security is concerned, perfection is impossible. Speaking of security, we cannot risk being overconfident: if people can hack into the regular cars on the road today, what could they do to automated cars that rely so heavily on their software and on the internet? This is why we have to be smart about the things we develop, not just cars: update constantly to stay ahead of the curve, protect our systems in layers, follow good coding practices, protect software AND hardware, and so on.

When we discussed the topic in class, a classmate asked whether we are ready for fully automated cars. I think we are never going to be, so we might as well start using them now. Maybe not this very moment, but, like with any new technology, we will solve the problems as we get to them; there are many hurdles we don't even know we have to jump, and issues only show up as we approach them. What do you think? Are we ready for an automated world? Should we move forward with self-driving cars? Should we wait? Or should we skip self-driving cars entirely because the risk is too great?

Whatever the future may be, if we learn from our mistakes I can be confident it will be great.

