Comedian and author Baratunde Thurston | Code 2019



Yes, yes. I, too, am still recovering from the carbon and noise emission pollution of the human flight experiment we were just subjected to. And I know she's all about fair elections, but let's just make her win them. It'll be dope.

I'm here because of this woman, Arnita Lorraine Thurston, who was a domestic worker, who was a survivor of sexual assault, who was a computer programmer who never graduated from college and worked for the federal government in the late '70s and early 1980s, enabling so much of my life. I had really early access to technology, and it made everything a lot doper for me. It was a simpler time. I got started in 1998, and most of the jobs that I have had, extracurricular activities created.

In remarks I gave three years ago, in March 2016, I said the following: these algorithms aren't pure or objective. Like journalists, they're embedded with the values of their makers; they reflect the society around them. This industry is all about making the world a better place, but an algorithm, or code, can't claim to do so on its own. From this we derive the headlines, and we've been talking about so many of them: losing access to our information, what's happening with self-driving cars, and of course there's the chief apologizer of them all. Facebook is really more of a cloud service at this point. I think of it as apology-as-a-service, or AaaS — their multi-cloud, my hybrid cloud, whatever works for you. But there is another way that we can pursue our future.

Last year I wrote a double-header piece for Medium where I tried to understand all the data that was out there about me — a data detox, if you will — and came out of that with a set of principles that I thought should guide us into the future. I then open-sourced that document using a massive platform company called Google, and people have added to it. So what I'm going to share with you are nine ways that I think we can dig our way out of this hole and build the internet, and the future, we were promised — and that I think we deserve.

The first thing is the notion of transparency and trust scores, and I mean real transparency. We are all judged by a number — most of us, a credit score — that determines our ability to get jobs and housing. I think it's time to flip that around and start judging the companies and organizations that mismanage our data in a much more transparent way: a sort of data nutrition label. When we want to know what's in our food, we don't drag chemistry sets to the grocery store and test individual parts; we read the label that our government has created a standard around, and it helps us become informed consumers and citizens in the process. Same with LEED certification. We could have a stamp that lets us know what the employee access is, whether there have been breaches, and what sorts of activities companies and organizations are doing with our information. And further: so many of these organizations are very good, very clear, very convincing when it comes time to sign us up and take our money, but when it comes time to explain what they're doing with our information, all of a sudden I need a law degree to understand that communication. Where's that growth-hacking mindset, where are those slick visualizations, when it comes to explaining what you're doing with this essential part of me?

The second step is to change the default settings from open to closed. We've lived too long in a world where defaults matter and we default to hoarding data and information about customers and users of these services. I propose a much more lean approach. Mozilla has a practice called Lean Data Practices, which I encourage every builder to take a look at. We could switch to something where you treat data like things you want to limit — sugar, Netflix, cable news, carbon — and try to get as much done as possible with as little data as possible. Be a data conservationist.
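As a rough illustration of that "default closed," lean-data idea, here is a minimal sketch in code. Everything in it — the field names, the settings, the helper function — is hypothetical and is not drawn from the talk or from Mozilla's actual Lean Data Practices materials; it just shows the principle of storing nothing optional unless someone explicitly opts in.

```python
# Hypothetical sketch of "default closed" / lean data collection.
# Field names and the helper below are illustrative, not from any real product.

REQUIRED_FIELDS = {"email"}  # the bare minimum needed to deliver the service
OPTIONAL_FIELDS = {"location", "contacts", "browsing_history"}

# Closed by default: nothing optional is collected unless the user opts in.
DEFAULT_SETTINGS = {field: False for field in OPTIONAL_FIELDS}

def collect_signup_data(submitted: dict, settings: dict = None) -> dict:
    """Keep only required fields plus whatever the user explicitly opted into."""
    settings = DEFAULT_SETTINGS if settings is None else settings
    allowed = REQUIRED_FIELDS | {f for f, opted_in in settings.items() if opted_in}
    return {k: v for k, v in submitted.items() if k in allowed}

# With the defaults, optional data is simply never stored:
profile = collect_signup_data({"email": "a@example.com", "location": "NYC"})
print(profile)  # {'email': 'a@example.com'}
```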
Third point: let's shift the mindset around data ownership and portability. I think — amateur lawyer here — we should start treating people's data as a part of their property. Whether I actively generated it or it's derived from my behavior, whether it's of me or about me, whether it's the content or the metadata, it's an extension of my being, and so you respect it the same way you hopefully respect other human beings' bodies — and obviously we're still working on that in a number of different categories. There's a further element to this: when we are enlisted in unpaid labor on behalf of massive organizations to get access to our own information, that should be recognized. I'm basically a co-owner of Waze at this point, as well as Waymo, for all the stoplights and crosswalks I've identified on behalf of Google.

Fourth point: I think there's a language opportunity to shift away from just privacy to permission — to words like consent, agency, self-determination, and sovereignty. Privacy has this binary feel: it's either private or it's not. Permission is something that can be granted; it can be temporary; it can be context-sensitive. We should open up that level of nuance to allow people to have much more control over ourselves and how we're represented.
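One way to picture that nuance is permission modeled as a grant with a purpose, a scope, and an expiry, rather than a single private/not-private switch. The sketch below is purely hypothetical — the `Grant` shape and its field names are illustrative assumptions, not any real consent API.

```python
# Hypothetical sketch: permission as a grant with purpose, scope, and expiry,
# rather than a single private/not-private switch. Names are illustrative.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Grant:
    purpose: str              # the context the user consented to, e.g. "route planning"
    scope: frozenset          # which pieces of data the grant covers
    expires_at: datetime      # consent is temporary unless renewed

    def allows(self, field: str, purpose: str, now: datetime) -> bool:
        """Granted per field, per purpose, and only until it expires."""
        return field in self.scope and purpose == self.purpose and now < self.expires_at

now = datetime.now(timezone.utc)
grant = Grant("route planning", frozenset({"location"}), now + timedelta(hours=1))

print(grant.allows("location", "route planning", now))  # True: in scope, right context
print(grant.allows("location", "ad targeting", now))    # False: different context
```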
And speaking of representation: we've lived forever in a world of systemic exclusion. I argue for systemic inclusion. This is not about just doing nice things or being charitable; it's about being relevant. It's about asking who's been over-served and who's been underserved, and using this superpower for something that actually shifts that balance a little bit.

Number six: imagine harder. There's a game that anyone building a service could play called "What's the worst thing that could happen?" — and I think we need to be playing that game much more before we press the launch button than after, figuring it out on the other side. There's a great product and process called Ethical OS, which Omidyar helps fund, that asks some of these essential questions as you're mapping out your products and your services. Bring these questions to the table before that launch button is pressed.

Number seven: we've got to break open the black box. There's been such mystery about how these services are recommending which video you should play, who might be a friend, where you should spend your money. It doesn't need to be that way. We have other industries where there is a level of accountability and inspection: peer review, food inspectors, auditors, journalists play this function. As we get more and more AI, and machine learning algorithms writing their own machine learning algorithms, we could use a similar approach, where we can challenge, inspect, and measure the impact of what these systems are putting out into the world. Because just trusting the machines is not a good idea. Just trusting the politician? Also not a good idea. We must be constantly vigilant in this.

Number eight: we need to upgrade and enforce the rules. One thing I'd love to see is a Data Protection Agency. I'm talking flak vests, pocket protectors — protectors, yes, pocket protectors and protectors. They bust into the open-plan office space when they get a whiff that you're illegally listening in on people's conversations with their friends and then serving them ads about it later, denying it all the while even though we all know it's true. You pay a price for that. Paying the price is how we enforce against the behavior we don't want to see anymore. It's a very simple thing we do with people who are caught driving or operating while intoxicated or under the influence: we take away their license, temporarily; we take away the vehicle, at least temporarily; and we don't allow them to cause further harm. Yet we have a massive set of violations of our sovereignty and our self-determination, and companies just keep doing the same old thing. They are still trusted with the data that they clearly cannot be trusted with. Increase the price, and all of a sudden I think they'll make those security investments.

The last point is that we should be empowering more than just consumption. We've taken the greatest technologies, the most advanced math, and some of the greatest minds the world has ever known, and we've used them to push advertising on people and to solicit money from them that they probably didn't want to part with in the first place. That's a low bar for the human experience, for human existence. Let's aim a little higher. One of the things I like to say is: let me do what you can do. I remember the first time I used Facebook as an ad buyer, as opposed to as a friend, and I saw my friends in a whole new way. I was able to organize them; I could understand what they cared about. I could see them more deeply — but only by seeing them as an object, as something to be targeted and extracted from. As the human relating with them, I had no access to this whatsoever. We have over-invested in the exploitation of our relationships and under-invested in the sincerity and depth and heart of them, and we don't have to make that choice every time. Apply the superpower that we're sitting on to other opportunities and to other problems. We're just at the beginning of this; it doesn't have to end this way.

There's one example that I'm really excited to share. I have nothing to do with it, but I'm just proud of them: there's an organization called JustFix.nyc, and they have used many of the technologies that we are debating about and celebrating and investing in today to help people whom many of us ignore — folks living in low-income housing and subsidized housing with landlords who are essentially slumlords. They're making it easier for folks to file complaints, to get in touch with housing court, to organize. They're using the same exact tools for a much higher purpose, and it's one of thousands of examples — but there should be even millions more in this particular direction.

Now I'm going to leave you with some homework, some reading assignments. I also have this whole deck available online; I really want you to use it in your organizations. Ethical OS; the Data & Society Research Institute; the AI Now Institute; and Project Include, which really does a good job of setting forth some ways that you can be more actively inclusive — systemically inclusive — in the hiring practices of some of these tech organizations. And a couple of books. The Age of Surveillance Capitalism: it should be mandatory, to come to Code, that you have to have read this book and proved that you read it, so maybe we can implement that for 2020. Decolonizing Wealth is not directly about tech, but it is about the history of capitalism in the West and the absolute exploitation of indigenous peoples, and what we've lost when we shoved that entire group aside.
There are still so many of them left, and they have a lot to share that we can learn from. I think there's a healing potential in all of this.

Finally, The Verge and their Better Worlds series. They didn't pay me to say this — didn't pay me at all — but it's a really good vision of what the world could be like. I think of it as sort of an inverse Black Mirror. We're very good at scaring ourselves about everything that's wrong; these are some really rich portraits of everything that could be right if we choose different paths today.

I'm never on a stage without making mention of the climate crisis and what we can do about it, so I also encourage you to read the book Drawdown, the most comprehensive plan ever to reverse climate change — which we can absolutely do.

The key question is: what kind of world do we want to be in? We have the power to imagine it, and we have the power to build it. Let's do all of that. Thank you very much. Wakanda forever. Thank you.
