Wicked Good Development Episode 16: Ted Neward's Philosophy 101


36 minute read time

Wicked Good Development is dedicated to the future of open source. This space is to learn about the latest in the developer community and talk shop with open source software innovators and experts in the industry.

What does philosophy have to do with software development? More than you might think. In this episode, hosts Kadi and Omar sit down with technologist Ted Neward and developer advocate Steve Poole to discuss how philosophy is at the heart of everything. Ted provides great insight into how his background has influenced his outlook on software development and why developers should be asking themselves the hard questions.

As Ted puts it, "It is the hard questions that are usually the good ones that will lead you to a positive outcome."

Listen to the Episode

 

Wicked Good Development is available wherever you find your podcasts. Visit our page on Spotify's anchor.fm.

Show Notes

Hosts

  • Kadi Grigg (Twitter: @kadigrigg)

  • Omar Torres

Panelists

  • Ted Neward - Technologist, Philosopher, Co-Founder of Solidify (Twitter: @TedNeward)

  • Steve Poole - Developer Advocate (Twitter: @spoole167)

Transcript

Kadi Grigg 00:10
Hi, my name's Kadi Grigg, and welcome to another episode of Wicked Good Development, where we talk shop with OSS innovators, experts in the industry, and dig into what's really happening in the developer community.

Omar Torres 00:20
Hola. My name is Omar, and I'll be your cohost today. We hope to bring you up to speed on the latest in the community through our conversations.

Kadi Grigg 00:28
In today's episode, we're featuring Ted Neward. Steve, we'll get to you as well, but Ted, can you give us a little bit of background on who you are and why you're here today?

Ted Neward 00:41
Well, my name's Ted Neward. I'm kind of a... I call myself a computational philosopher. That's my sort of self-given title. I've been in the industry now for 30 years, been programming since 1978 on an Apple II Plus with 48K of RAM, and not one, but two floppy drives. That was the hot setup back then. I've spent a lot of time as an architect, as an instructor, as CTO of startups. I've worked at companies as large as Rocket Mortgage, and all of it I've done with a degree in international relations.

Kadi Grigg 01:22
Just shows the world you can do anything with that degree.

Ted Neward 01:24
It's literally the degree that takes pretty much all the liberal arts and smashes them together, with the possible exception of art history; we didn't really study much of that. But just about every other liberal arts degree: economics, sociology, philosophy, even a tiny bit of archeology, political science, history, all of that mashed together. And I mean, I took some fun classes in college, including one that was the political science of national conflict. The final in that class consisted of one question, two words: why war? I had fun in college. I had so much fun in college.

Kadi Grigg 02:16
The possible responses are endless there. I can only imagine.

Omar Torres 02:20
Steve, I know that we've had you on before, but can you introduce yourself again and maybe give a little spiel as to why you're here today?

Steve Poole 02:29
Yeah. Okay. So my name is Steve Poole. I'm obviously the spare wheel here. I'm a DevRel person at Sonatype. I've been doing Java and things like that for, well, not quite as long as Ted, I think, but pretty close. And what can I say? I just like helping people be successful. That's always been the thing I care about: whatever it takes, however we can help developers move forward and be just a bit more successful. For me, the enjoyment has always come when you help somebody move forward, help them solve a problem, learn something new. That's the fun. I did that working on products, and now I do that as a DevRel, and it's just the best fun ever. And you get to go to conferences and things like that.

Ted Neward 03:16
The funny thing, Steve, is that DevRel in many ways is a lot like international relations, because it's a smorgasbord of all the different things. It's a little engineering, it's a little sales, it's a little marketing, it's a little education, all these different things brought together into a collective whole. It's part of the reason why I like hanging out with developer advocates: you often get a very, very broad and rich collection of conversations, but also experience. I mean, your average DevRel team is made up of a much more deeply diverse collection of people than what you get with your average engineering team.

Steve Poole 04:05
Yeah. And we're all opinionated and vocal because that's part of the job as well.

Ted Neward 04:10
That's the best part. Being all opinionated and stuff.

Steve Poole 04:14
Yes, yes. I mean, the last podcast we had, which was about what was happening at Devoxx Poland, which you can go watch, there were some opinions expressed, and fun stuff, because we enjoy this. It's a hard job, but at the end of the day, it's such a fun thing to do.

Ted Neward 04:34
Yeah.

Steve Poole 04:35
We can't recommend it enough. And by the way, we're hiring. Got that bit outta the way.

Kadi Grigg 04:39
So let's dive in. Ted, you kind of walked us through some of those different models in the beginning when you were explaining a little bit of your background and your interest in this. But I wanna understand: how did you transition from having a degree in international relations to all of a sudden having a computer science background and giving keynotes about different things across the globe? I mean, that's a major shift. So how did that come about?

Ted Neward 05:05
Kadi, I haven't had that question in probably 20 years; I used to get it all the time in interviews. It was not the easiest thing in the world to start, because when I would go into interviews someplace, they would look at the resume, they would see graduated 1995, Bachelor of Arts in International Relations, and they would say, okay, why do you want a job as a programmer? In many respects, what they were asking is: how can you prove to us that you're a developer when in fact you have no academic credentials to speak of? While I was in college, I took a couple of CS courses, mostly just crashed them, because I wanted to see... About a year and a half, two years before I graduated, a buddy and I were sitting down looking through the want ads in the paper, looking for jobs.

Ted Neward 06:02
One roommate was actually the son of an IBM research fellow, and so his career path was pretty clear. He was gonna be a programmer, and he just wanted to get out of school with the degree that required the fewest credits, which, as it turned out, was comparative literature. And his comment was, look, I've talked with my dad's coworkers at IBM, and at the end of the day, people in the industry don't care what your degree is in as long as you have a degree. Now, this was the 1993-94 timeframe. I'm not saying that that's true today. And so he got his comp lit degree and left, and I think went into educational software, games and whatnot. But the other thing he said is that people want to see that you can write code.

Ted Neward 06:53
And so the three of us were actually working on a board game, a war game, actually. The game was called Supremacy. For anybody who remembers this from the nineties... Steve's nodding vigorously. It's basically Risk plus economics and nuclear weapons, and it's for up to six players. And we had a hell of a time trying to get six people to show up at the apartment all at the same time; as mentioned, we were in college. So we had this idea that we would build this game, and then we would run like five telephone lines into the apartment and everybody would be able to dial in. Cause again, remember, this was the '93, '94 timeframe. We didn't have this ubiquitous WiFi that all you Gen Zers have. Back in my day...

Kadi Grigg 07:43
Is this a LAN party?

Ted Neward 07:45
No, this was even before LAN parties.

Ted Neward 07:48
This is... Steve's gonna love this. This was a BBS.

Steve Poole 07:51
Yes.

Ted Neward 07:52
You know what that is, Kadi? Do you even know? You have no idea.

Kadi Grigg 07:55
I have no idea. No.

Ted Neward 07:57
Bulletin board system. Go Google it, look it up. It's in your history textbook somewhere. Anyway, we're building this game, and it was Windows programming. And so I was helping build this, and when I took a couple of CS classes, I realized that I knew more about programming than the people who were graduating with the CS degree, because I was doing it for fun, and they were doing just the assignments they needed to do to get the degree and then would graduate. And that was a really interesting revelation to me. The other thing, as I said, this roommate and I were looking for jobs, and he saw a position that was looking for a Windows developer to work on medical billing software for doctors and so forth.

Ted Neward 08:43
And he's like, oh dude, you should apply for this. I'm like, come on man, they're looking for professional programmers. He's like, come on, what have you got to lose? And I'm like, I don't know. And then, guys, Steve, Omar, you'll know what I'm talking about when I say this: the heinous thing that he did next, he double dog dared me to interview. Now, Kadi, I don't know if this is true among college-age women, but among college-age men, a double dog dare, I mean, this is threatening everything that you hold dear. This is your manhood, which is at stake in so many ways. And so I interviewed, and out of like 40 applicants, I was one of like three finalists, which just blew me away. It absolutely stunned me that I was being seriously considered for this position.

Ted Neward 09:42
I ended up not getting it because I wasn't a professional developer, and they could spot that fairly quickly. But the fact that I got that far on just what I had self-taught over the last couple of years in C++ and whatnot... And again, just as a contextual note, back then C++ was the hot thing. It was '93; it was the sexy thing, very much like Java in '96, very much like C# in 2002, 2003. The fact that I knew anything about it was intriguing to folks. And it really sort of brought home the idea that yeah, if you could write code, people wanted to hire you, period, full stop, regardless of what your background was in.

Ted Neward 10:33
And that really was what set me on this entirely different arc. Because prior to that, I was seriously contemplating a degree, or I'm sorry, a career in either the intelligence space, working for the CIA or the DIA as an analyst, or the ambassadorial corps. I mean, think about that, listening audience: Ted Neward sitting there sipping mai tais with the Prime Minister of Australia. We'd have been at war in like two weeks. Do you think I'd be any better than Boris? Come on, man, give me your vote. We'll get to that later.

Omar Torrres 11:12
Ted, could you talk about your evolution from the story you just told to what you said earlier? I think maybe before we even started the call about you being a computational philosopher, what does that mean to you? Can you say a little more about that?

Ted Neward 11:26
Philosophy is a really interesting space that I have, in the last decade, decade and a half, really rediscovered, or discovered, I should say. Cause back in college, philosophy was just really annoying, because philosophers ask all these really annoying questions, like: prove you exist without using any of your senses. Questions like that would just drive people nuts. And that's really what we knew of philosophers. That, and they were old Greek dudes running around in togas doing bizarre things, trying to figure out, if you wanted to move the earth, how long of a lever you would need. Just really bizarre things. And what I've discovered in the three decades since is that philosophy is really at the root of everything. And I can prove it to you, because it's called the Wikipedia philosophy game. Have you ever heard of it?

Kadi Grigg 12:32
No, but I'm intrigued.

Ted Neward 12:33
Take any subject, any person, any topic, anything that's in Wikipedia. Go to that page, click on the first link. Then from that page, click on the first link. Then from that page, click on the first link. And if you do this often enough, eventually you will end up at philosophy. Because if we can prove it, it became a science, and if we can't prove it, it became a religion. Philosophy is really at the heart of everything. Aristotle, remember, was the first natural philosopher, and much of what he studied turned into the sciences as we know them. And the word itself means love of learning, love of knowledge, depending on how you wanna translate the Greek. And as I got deeper into development, particularly as people started asking me for higher and higher level opinions, particularly when we would get down to the subject of doing keynotes and whatnot, it really began to become more of a philosophical inquiry.
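
Ted's Wikipedia game is easy to try in code. Here is a rough Python sketch, assuming the third-party requests and beautifulsoup4 packages; the link-picking heuristic below is simplified (the informal rules of the game also skip parenthesized and italicized links), so treat it as an illustration rather than a faithful referee.

```python
# Follow the first main-body link on each Wikipedia page and see
# whether you land on the Philosophy article.
import requests
from bs4 import BeautifulSoup

BASE = "https://en.wikipedia.org"

def first_link(title: str) -> str | None:
    html = requests.get(f"{BASE}/wiki/{title}").text
    soup = BeautifulSoup(html, "html.parser")
    # Walk the article's body paragraphs in order, taking the first
    # plain article link (skipping File:, Help:, etc. namespaces).
    for p in soup.select("div.mw-parser-output > p"):
        for a in p.find_all("a", href=True):
            href = a["href"]
            if href.startswith("/wiki/") and ":" not in href:
                return href.removeprefix("/wiki/")
    return None

def play(start: str, max_hops: int = 50) -> None:
    seen: list[str] = []
    page: str | None = start
    while page and page not in seen and len(seen) < max_hops:
        seen.append(page)
        print(page)
        if page == "Philosophy":
            return  # we made it
        page = first_link(page)

play("Computer_science")
```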

Ted Neward 13:42
And many of the tools that I used back in college to study things like why nations go to war involved creating models, not models in the sense of software models as we think of them, but models of thinking. When we start thinking about object-oriented programming and then we start thinking about functional programming, for example, those are really just two models of how to think about code. And so if you want to analyze the models, you are essentially now engaging in the act of philosophy. You're trying to ask certain questions. What does functional programming do well? And what does object-oriented programming do poorly, or vice versa? Because functional programming is not the be-all end-all that some of the functional folks want it to be, particularly if you're trying to deal with mutable state, and frankly, the world is mutable.
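
To make the "two models of thinking" point concrete, here is a small sketch, in Python with invented names, of the same state change seen through each lens: an object that mutates in place versus an immutable value transformed by a pure function.

```python
from dataclasses import dataclass, replace

# Object-oriented lens: an Account owns its state and mutates it.
class Account:
    def __init__(self, balance: int):
        self.balance = balance

    def deposit(self, amount: int) -> None:
        self.balance += amount  # state changes in place; history is gone

# Functional lens: state is an immutable value, and a deposit is a
# pure function returning a *new* value, leaving the old one intact.
@dataclass(frozen=True)
class AccountState:
    balance: int

def deposit(state: AccountState, amount: int) -> AccountState:
    return replace(state, balance=state.balance + amount)

acct = Account(100); acct.deposit(50)         # old balance unrecoverable
s0 = AccountState(100); s1 = deposit(s0, 50)  # s0 still exists alongside s1
```

Neither lens is wrong; they trade mutation-in-place against a visible history of values, which is exactly the kind of trade-off Ted is asking us to interrogate.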

Ted Neward 14:34
I mean, we can get into some really interesting debates about data, the notion of data. Does data ever change, or is it simply that the tuple that we're dealing with changes? The fact that the four of us are on this call at this moment in time, that will be immutable for that moment in time, but the next moment in time, that may actually be a different set of data. Does that mean that the data changed, or that we're just simply constantly aggregating new sets of data? And if that's the case, then this really gets to the whole notion of event streaming: that databases really shouldn't be mutable, but should in fact capture the state of the world at any given moment in time, and that we should never delete data, because at that time the data was correct.
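
A minimal sketch of that append-only, event-streamed view of data, in Python with invented names: facts are recorded forever, and "who is on the call" is derived by replaying the log up to a chosen moment in time.

```python
import time
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    at: float      # when the fact was observed
    who: str
    action: str    # "joined" or "left" the call

@dataclass
class EventLog:
    events: list[Event] = field(default_factory=list)

    def append(self, who: str, action: str) -> None:
        # Facts are only ever appended, never updated or deleted.
        self.events.append(Event(time.time(), who, action))

    def on_call(self, as_of: float | None = None) -> set[str]:
        """Replay events (optionally up to a point in time) to derive
        who was on the call at that moment."""
        present: set[str] = set()
        for e in self.events:
            if as_of is not None and e.at > as_of:
                break
            (present.add if e.action == "joined" else present.discard)(e.who)
        return present

log = EventLog()
for name in ("Kadi", "Omar", "Ted", "Steve"):
    log.append(name, "joined")
checkpoint = time.time()
log.append("Ted", "left")
print(log.on_call(checkpoint))  # all four, as of that moment in time
print(log.on_call())            # the world has since moved on
```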

Ted Neward 15:22
But then we also run into scenarios where, in fact, what we discover is that our understanding of the data was incorrect. Cause remember, the Greeks used to have this belief that there were certain things that were the smallest you could get. Those were what they called atoms; that's why we use that term in modern science. And that everything was made up of four elements: earth, air, fire, and water. Well, okay, today our knowledge is different. It's better. So again, if we're trying to model knowledge in a database, does that mean we wanna go back and change that data? Or do we wanna represent what we incorrectly understood back then? These are all philosophical questions. This is all a notion of: do we want our database to represent truth, or do we want our database to represent our understanding of the truth? And we are right back to wearing togas out there in Athens, arguing with one another or hiring sophists to do it for us. That's where we get the word sophistry. They were literally the first lawyers; you could pay somebody to go and argue your point in the square, which was the Greek form of democracy. Those were the sophists.

Steve Poole 16:34
It sounds like you've just labeled blockchain as a philosophy.

Ted Neward 16:38
Don't get me started on blockchain, Steve. Just don't.

Steve Poole 16:45
But then everything…

Ted Neward 16:46
Well, see, the thing is, philosophically, blockchain assumes that it can be the source of truth. And this gets into some of that interesting philosophical notion of: is there such a thing as objective truth? A college buddy of mine, our medieval studies professor, and I were literally the only three in the medieval studies lab one day, because nobody else showed up, and we three spent the entire time, plus another hour at the coffee house on campus, arguing over whether or not there was a notion of universal truth. Is there in fact a notion of universal truth, or is in fact all truth merely my and your perception of it? And therein lies some of the roots of conflict. I mean, we can start to look at some of the current scenarios, but even questions of physics.

Ted Neward 17:46
Is the speed of light a constant? No, Einstein's later theories proved that light in fact was potentially relative in certain scenarios, that parallel lines could in fact end up intersecting. And when you can't even trust physics to be true 100% of the time, then can we really argue that anything has universal truth? This is what philosophers do. And that's why, if we can prove it, it becomes a science; if we can't, it becomes a religion. I mean, the flat earthers right now are a religion, whether they wanna admit it or not. And if they could prove it... I mean, that's the thing. Can a flat earther find the edge of the planet and look down? Until they can do that, it remains a religion. The other half of this is computational philosophy: how do we apply philosophy to computing, to programming, to all the things that the four of us make our money around? And that's the heart of the term.

Kadi Grigg 18:55
We were speaking with a senior security researcher yesterday, Ax Sharma, who was reporting about protestware. That's something new, I think, in the development community, and something I personally had never seen or heard of until yesterday. It kind of boggled my brain a little bit: developers are now almost self-sabotaging, putting up messages about getting better news information, et cetera. What are your thoughts on these types of activities?

Ted Neward 19:22
So I'm not familiar with the term protestware either, so if I start making assumptions in this that are incorrect based on the conversation you had yesterday, stop me, just so we can remain at least as accurate as possible. But part of the thing, and this is where we're gonna dive a little bit into some of the economics: during the Great Resignation a year or two ago, and even before then, with some of the efforts around unionization at various companies, we're seeing unions in many respects. Starbucks is looking at union efforts. Trader Joe's is looking at union efforts, and Google has been staring down some union efforts. And this is something that the programming space has talked about for quite a while. Originally the conversation was: we should have some sort of governing body, so that not just anybody can call themselves a programmer, but only people who are in fact trained as engineers.

Ted Neward 20:25
In many other spaces, I can't just put on a white lab coat and call myself a doctor. There are certain exams and certain things that have to be proven in order for me to make that assertion, whereas anybody today can call themselves a software engineer, and we should change that. The unionizing effort now, though, is much more about the individual developers, programmers, asserting their own power in many respects. And so, for example, there was an effort at Microsoft to resist the fact that some Microsoft software was being used to help INS here in the US, and many, many people on the left side of the political spectrum despise INS, in many cases because they don't really seem to have a need to follow our concept of civil rights. They can detain you for any reason. They can deny you your one phone call.

Ted Neward 21:21
If you live within 50 miles of the border, any border here in the US, the INS can absolutely set those rights aside. And so many people at Microsoft, many people at Google and some other companies were saying, no, man, that's crap, and I will not, as an individual, allow my work to support a cause that I do not support. Now, that in itself raises some really interesting questions, because if I'm paid for the work, do I still have ownership? Do I still have any association with the work? If I do, then that raises some interesting intellectual property concerns, especially around things like legal liability. If I do not, then technically the only option I really have as a worker is to quit and not work for that company anymore. Which, to some degree, argues partly in favor of the union efforts, because that was the same argument that they made 150 years ago: well, if I don't want to lose a finger in the machine, my only option is to quit.

Ted Neward 22:31
Well, unions were designed to protect people's physical safety and allow them to have a comfortable place in which to work and earn a living and so on and so forth. But now we can make this argument, and I'm squinting pretty hard here, to say: okay, mental health. We've spent a lot of years just in the most recent times talking about mental health and the importance of mental health and work-life balance. And if working for a company is gonna lop off pieces of my soul, and if we value mental health, doesn't that mean that we should be thinking about these same kinds of efforts? And so does that mean that Microsoft has to check in with every single one of their developers every time they wanna sell software to somebody? I mean, if I'm a Democrat and Microsoft wants to sell Microsoft Word to the Republicans, do I get to veto that sale?

Ted Neward 23:30
There's a really interesting balance of power that goes on around some of this. And I remember not too long ago when we really thought the internet was gonna be this great democratizing force, with social media and the Arab Spring and all of that stuff. And then we discovered that social media was actually being weaponized in several ways. So social media went from, oh my God, this is the most amazing thing ever, it's gonna democratize the universe, to, oh my God, social media is the weapon of tyrants everywhere. I mean, this is one of the interesting things and the frustrating things about philosophers: I didn't answer your question, because I don't know the answer to that question, because the questions in some cases are more interesting than the answers. I think the really interesting question is: how do each of us feel about protestware? And what does that say about our relationship to our employers, to our fellow man, to our fellow developers, et cetera? Because the answers to those questions, in some cases, are going to be the ones that will decide whether or not the industry is shifting fundamentally, or whether or not this is just yet another interesting footnote in the history of computer science.

Steve Poole 24:53
Yeah. So I think the particular thing we're talking about here, protestware, the specifics are: you are using a piece of code, a dependency that's been running forever, and one day it doesn't do what you thought anymore. It does something not necessarily benign, but something like putting up a protest, or in some cases, specifically to do with Ukraine, trying to give people in Russia access to information they didn't normally have access to. So it's not malware, but it has unexpected and possibly unwanted side effects. And the interesting thing is how this gets at a sort of common understanding that we've all had about open source, which was that it was for the good of everybody. And so whenever you have an instance where somebody changes their code for bad reasons, like malware, or actually in this case for protestware, suddenly you have to check your assumptions about how and why you're using open source. Because that idea that it was there forever and you could use it forever, always for good... well, no, not so much now.

Ted Neward 26:11
Well, see, here's the funny thing about open source, Steve. There's so many different threads firing inside my head right now; they're battling for control of the mutex that is my mouth. With open source, we don't really think about open source the way that the original advocates of open source thought about it, because realistically speaking, if you don't like the protest, you can fork a copy of the open source and you can now support it yourself. That was always the model. That's why it would be free forever: because I could see the code, I could get my own copy of the code. That's not the way that most of us interact with open source, though. Most of us treat open source as a product, not as an open source project.

Ted Neward 27:05
And now this starts to get into some interesting assumptions, assumptions that are being turned on their head when I say that. If I say open source is a product, there are certain things that we make assumptions about with respect to the product. We do not expect Microsoft Word to flash up "Slava Ukraini" as a splash screen when I start Microsoft Word. But if they did, how many of us would be upset, and how many of us would just say, okay, whatever, and move on? Do they have a right to put that message up? There are a lot of people who will argue they absolutely do not, and that would be an interesting one to try in the courts. But the other thing is, much of the open source space, what a lot of people really don't think about, is something that economists frequently do.

Ted Neward 27:58
Five years ago, if I said the words supply chain, most people would have no idea what I was referring to. Now, of course, worldwide, with the disruption of the pandemic, the shortage of labor, et cetera, everybody's getting comfortable with it. But open source is part of the supply chain of your software. And one of the things, I mentioned McDonald's earlier: one of the things that made McDonald's successful as this global enterprise is they took control of their supply chain. They make their own buns. They raise their own cows. They farm their own lettuce and tomatoes and so forth. When you drive out of McDonald's with that machine-assembled burger, every aspect of what you're about to eat, including the cardboard packaging, came from suppliers that McDonald's owns.

Ted Neward 28:57
They own their entire supply chain. And it's funny, because just yesterday there was a comic that came across Twitter: two people in hard hats, obviously working on some construction site, and a picture of a screwdriver with a $2 price tag on it. And the one hard hat is looking at the other hard hat and saying, really, you bought the screwdriver instead of making it yourself? Because we do that all the time as developers: why do you insist on building everything? Well, I mean, there is an argument to be made here that if you build it, it's a part of your supply chain, and therefore you have full control over it, and you would never be a victim of the protestware. The other side of this is, if I depend on a particular package and they decide to put up protestware, and especially if they decide to do it in a public manner, am I legally liable?

Ted Neward 29:51
Forget "Slava Ukraini" for a second. Let's suppose they put up a neo-Nazi flag. Am I legally liable for displaying it? Because anytime we talk about something that could be used for good, it can also be used for bad. We have to acknowledge that. Am I legally liable now for antisemitic messaging from my software? These are the kinds of things, and this is partly why developers get so frustrated at so many companies when it's like, why can't I just use an open source package? Because that's an additional element to the supply chain that we need to make sure we understand, and that we understand the legal liability implications of using. Because that is a big deal. That is a very, very big deal.

Steve Poole 30:40
Yeah, I mean, what you touched on there shows that the line between protestware and...

Ted Neward 30:48
Malware. It's a very, very thin line, and it has more to do with the intent of the person writing the ware than it does with the actual ware itself. Because, and this is something, personally my politics are left of center, but this is something that most NRA gun rights activists will tell you: guns don't kill people, people kill people. That's true, because the gun is just a tool. And believe me, I definitely wanna see more guns off the streets of the United States, and I will probably lose a few listeners from saying that. But the point is, anything we build, anything we do: if I create a knife, I can use it to cut my steak during dinner, and I can use it to stab somebody.

Ted Neward 31:37
Anything can really be used for both benign and malicious purposes. And so, in many respects, when we talk about viruses, I mean, another word for virus is self-modifying code. And another word for self-modifying code is Lisp. This is a tool that can be used for both good and bad things. Another name for a virus is a production patch: we change the running software and keep it going. And this is part of the reason why, personally, I think as an industry we need to stop spending so much time focusing on churning out all these candidates who know how to do big-O notation around algorithms and whatnot, and start churning out people who have a deeper background in philosophy. My personal preference would be that the United States pass a law that philosophy is taught in kindergarten, because children are the world's most natural philosophers. If you've ever been around a five-year-old, you know it, because they just keep saying why, why, why. And that is exactly what philosophers do. It's why they're so annoying.

Kadi Grigg 32:48
I told my sister I wouldn't pick up her kid anymore from daycare because of that.

Ted Neward 32:52
But Kadi, that's the frustrating thing: we shut that down. As parents, we shut that down because we don't have the answers to those questions, and we think as parents that if we don't have the answers, we're somehow failing our children.

Kadi Grigg 33:07
Yeah, that's true. And the thing is, if you admit you don't know, you're not far off. I think it's okay to admit you don't know.

Ted Neward 33:15
And that's the thing: admitting that we don't know is absolutely the first step to knowledge. "I only know that I know nothing," as Socrates said.

Steve Poole 33:26
Yeah. So Kadi, you've gotta think about it: you are missing the opportunity to spend time with a five-year-old philosopher.

Kadi Grigg 33:33
It's true.

Ted Neward 33:35
That's the thing that I regret in many respects. My kids are 29 and 22, and I regret that in many cases I did that parental thing of shutting them down when they were young and asking those questions, because we do stifle children's natural curiosity about things. Now, in some cases we do it because we're trying to keep them alive: no, don't touch that hot cookie sheet. I actually blogged about this a long, long time ago, on a long-since-defunct blog. My eldest was like five or six, and we were taking cookies out of the oven, and he wanted cookies. So he was reaching, and it's like, no, don't touch that, it's hot. And he was reaching, don't touch that. And the third time, it's like, okay.

Ted Neward 34:27
And he touched it and burned his finger. Just a little blister-type burn. And I blogged about the fact that this is what parents refer to as natural consequences. We have a different acronym for it today; FAFO, I think, is how it goes. And I got a number of people who were like, oh my God, you're a horrible parent for allowing your child to burn himself. But that is one of the ways we learn: through natural consequences. You can't be around your kid all the time, etcetera. This actually goes back, again, to philosophy. There are really two kinds of knowledge that we have. One is knowledge that we've accumulated from our own experience.

Ted Neward 35:18
I touched the cookie sheet, I burned my finger. The other is knowledge that we've acquired from other people. I personally have never taken crack or heroin, because I assume it's bad. I didn't have to experience it in order to discover how terrible it is. And with children, we want to allow them to learn from experience, as long as the experience doesn't hurt them too badly. But if you think about it, and this goes back, Kadi, to when you and I were in Kraków at the conference: conferences are an attempt to try to teach people through our experiences as speakers, so that they don't have to burn themselves on a cookie sheet. And yet in some cases we try to replicate the experience by showing them demos. Which, is that really replicating the experience? Or is that in fact just saying, look, there's a pothole here, I can show you the pothole? This is what I mean: philosophy underlies everything. It's everywhere.

Omar Torres 36:26
Is there anywhere right now in software development where we should be applying more philosophical thinking? My question stems from: is there anywhere in software development where we are heading towards one of those natural consequences? Or maybe we should be trying to watch out, or maybe some people are saying watch out, but we're waiting for that natural consequence before we have a reaction.

Ted Neward 36:54
So Omar, let me ask you a question cuz that's what us philosophers do. We ask questions.

Omar Torres 37:00
Ok, yes.

Ted Neward 37:01
You get in your car, you drive down the road, and you see a bicyclist in front of you. All of a sudden something happens and the bicyclist goes down, and the rider is right there in the lane of traffic. You can choose to swerve left, cause if you keep going, you'll kill them. You'll just run them right over; you're traveling fast enough. If you swerve left, though, there is a strong chance you'll create a five-car pileup, which potentially could be fatal. You can't tell. What's your choice? This is known as the trolley problem.

Kadi Grigg 37:40
I always struggle with this one. Even since college, the trolley problem is brutal.

Ted Neward 37:44
Okay? But here's where things get even more interesting. You are now the developer writing the software for the self-driving automobile.

Ted Neward 37:57
And you are faced with that choice. Which do you choose? And just for bonus points: are you in fact legally liable if you write the code to run the bicyclist over, accepting that one certain fatality is better than potentially five fatalities? Either way you go, are you liable, you, the developer of the software? Cause you are making that decision ahead of time, a decision that this computer will just faithfully execute. You can make the argument, in the heat of the moment, that you didn't have time to really sit down and analyze the problem. Even the most intricate analysis of the trolley problem loses sight of the fact that you have to make a split-second decision. But as developers, we don't; we actually get to decide that long in advance. So if you write the code to say, okay, I'm gonna go with the known as opposed to the unknown, does that make you legally liable?

Ted Neward 38:59
And does that make you a murderer?

Omar Torres 39:01
That's a great question.

Ted Neward 39:02
If you're getting the impression that I'm thinking that AI and machine learning desperately need ethical oversight, you would be right.

Omar Torres 39:09
That's right.

Ted Neward 39:10
That's the most obvious case.

Omar Torres 39:13
Okay.

Ted Neward 39:14
That is... when I was at Rocket Mortgage, one of the teams I was leading was working on a problem in AI and machine learning, and we actually consulted with a team inside of Rocket Mortgage specifically geared towards thinking about the morals and ethics of the data science and machine learning work. If for no other reason: Amazon has gotten into trouble recently. They created a system that would do analysis of resumes to determine whether or not they should proceed forward with a candidate based on the resume. And do you know what Amazon discovered? That the best chances of getting hired at Amazon were for those people who were white males. Kadi's eyebrows just went and hit the ceiling.

Kadi Grigg 40:06
Wow. I wasn't expecting that.

Ted Neward 40:08
Well, because here's the thing: when you do machine learning, it's a giant pattern recognition machine. It was recognizing what was already true at Amazon. It had nothing to do with skills; it had to do with the fact that most of the developers at Amazon are white males. And so the machine learning said, oh, well, given that that's what leads to success, then this is clearly the best. And Amazon very quickly shut that program down, because that's not what they want to do. But that's the danger of machine learning: it can tell us the way things are now, but it can't necessarily infer the way we would like things to be. And so I think it's great for image recognition. Here are all these images of wiener dogs, and here are images of hot dogs, and we can learn the subtle differences between the two and eventually get to a point where we can create an algorithm that recognizes wiener dogs as opposed to hot dogs.
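
A toy sketch of the failure mode Ted describes, in Python with invented data: a naive "hiring model" that scores candidates by the historical hire rate of their group will faithfully reproduce whoever was hired before, bias and all.

```python
# A skewed historical record of (features, was_hired) pairs. The
# data and feature names are invented purely for the illustration.
history = (
    [({"skill": "high", "group": "A"}, True)] * 40
    + [({"skill": "high", "group": "B"}, False)] * 35
    + [({"skill": "low", "group": "A"}, True)] * 10
    + [({"skill": "low", "group": "B"}, False)] * 15
)

def hire_rate(group: str) -> float:
    outcomes = [hired for feats, hired in history if feats["group"] == group]
    return sum(outcomes) / len(outcomes)

# A model that scores by historical hire rate "discovers" that group
# membership, not skill, is the best predictor of getting hired,
# exactly because group A is who got hired before.
print(hire_rate("A"), hire_rate("B"))  # 1.0 vs 0.0: bias, not skill
```

The model has learned nothing about skill; it has only memorized the past.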

Ted Neward 41:06
But it can't then look at a doberman and say, oh, that's a dog too. See, this gets back to some of the notion of what we do with models, because all models are broken, but some of them are useful. How do we validate that the model we are building is not, in fact, broken so badly that it's not useful? And so software and machine learning can give us insights that get to conclusions we've already raised, but they can't necessarily get us to new conclusions. That's why I actually hate the phrase artificial intelligence, because even if we could get a machine to, assume the classic science fiction trope, even if we could get a machine to be fully sentient, and by the way, we'll have to debate what that term sentient really means. Do you have to feel emotions in order to be alive?

Ted Neward 41:59
This is something that we've debated in both the philosophical and the religious space. Do animals have souls? That was a major debate in religious circles a couple hundred years ago. But we know that animals can feel things. What about plants? I mean, that's one of the reasons vegetarians don't eat animals, because they say, I don't want to eat anything that can feel pain. Hate to tell you this: plants can feel pain. We have actually discovered, through deep analysis of some of the electrical activity in a plant, that they can feel pain. So does that mean we stop eating plants too? What do we end up eating, Soylent Green for everybody?

Ted Neward 42:44
So, to go back to your original question, Omar: the whole machine learning space is definitely a place that needs ethical oversight. It needs philosophical inquiry. I think most things in software require some philosophical underpinning, and I think the best thing that software developers can do is learn elements of philosophy. But it's also gonna be the most frustrating thing, because once you start down this path, it really begins to call into question so many things, and you really start looking at the world differently. It's a drug, and once you take it, you never look at the world the same way. You really don't.

Omar Torres 43:25
Steve, do you have any other questions? I'm only looking at the time; I'm actually really enjoying this conversation.

Steve Poole 43:32
No, we could go forever. But the thing that gets me, and Ted's touched on it, is that when we started way back, software was sort of contained, and now the world runs on and relies on software across the board. Software and developers haven't really grown up to that, and that is a challenge. I mean, we see that. We talked about protestware, but this idea of, I own a piece of software and whatever it does is up to me, that was okay when you had five people using it, but not when it could be embedded anywhere. It's all run ahead of us without, I don't mean moral education, but some education for people who write code that they have responsibilities, and they need to think more about what they're writing. That comes out when we talk about security vulnerabilities: we see how often good intentions get subverted into bad things, because people were going, oh, nobody will ever use this for bad purposes. And nowadays, if a bad guy can get to it, they'll subvert it. That's life.

Ted Neward 44:46
Well, the thing is, Steve, this is where history comes into play, because it's not like this is the first time we've ever had this particular dilemma. Remember Robert Oppenheimer: as soon as they discovered that Germany was not, in fact, working on a nuclear weapon, and as soon as we had in fact taken the heavy water plants that Germany would've relied on, he wanted to stop the development of the atomic weapon. And the US Army said no. And one of the things that I found personally fascinating: when I was at UC Davis, one of the classes we had was Physics 137, the physics of nuclear weapons. And it was taught by two grizzled old physicists, one of whom had actually been at White Sands, New Mexico, working on the project. And he said that the day they set off the first weapon, there was actually a betting pool going among the scientists as to whether or not the weapon would work.

Ted Neward 45:48
And then he paused and he said, actually, there were three categories. About half of the physicists thought it would work as expected. Basically two thirds of the half remaining thought that it wouldn't work. And there was a vocal contingent that thought that once the reaction started, it would never stop, that in fact it would just keep creating chain reactions and turn the entire earth into a nuclear fireball. The earth would basically turn into another sun. You could have heard a pin drop in the classroom, until a young woman at the very front said, then why did you let them set it off? And he'd obviously gotten that question before, because he just looked at her and said, young lady, this was a project run by the United States Army. We were not in control. It was not our choice. The army wanted their weapon.

Steve Poole 46:45
Yep.

Ted Neward 46:45
And nuclear power is one of the most powerful things that we have on this planet. We can use it to create vast amounts of energy in a controlled manner. We can also use it to create a really, really big bomb. And every technological development that mankind has ever come up with has in fact been something that we could use as either a boon to mankind or a weapon. Anything. Even if it's just: I invent this really, really cool thing, and now I weaponize it by taking it away from the people who need it. I mean, as we're seeing now with the grain and farming and whatnot: we've optimized the agricultural space so much that now Ukraine is under conflict and the fields are under attack, and that's disrupting the supply chain.

Ted Neward 47:38
And now there's a good chunk of the world that's potentially gonna starve, because we've done all this optimization and now food is itself a weapon. It's always been a weapon, but we're seeing it play out here in the 21st century. This isn't a new problem, and it isn't necessarily one that we as software developers can solve. Cause that's the other great problem we have: thinking that software can solve everything. I remember talking with a startup here in Seattle that was gonna solve the problem of corruption in India by creating a social media site where you could report corruption, and therefore people could in fact root out corrupt agents. And it's like, yeah, and how do you think that's not gonna be corrupted? Because the first thing I'm gonna do is falsely report the one non-corrupt official. I mean, bribes are frankly a fact of life in many governments, and you could argue many Western governments as well.

Ted Neward 48:37
It's just that the bribes are different and squirreled away more. At the end of the day, there's all these things that we can do. Social media can be weaponized, but it can also be a tremendous power for good. The atom can obviously be weaponized, but atomic power can also be used for a tremendous amount of good. This is not a technological problem. This is a human problem. And one of the things developers would be really, really well served to understand is that you cannot solve human problems with technology. Technology can only help humans solve human problems.

Omar Torres 49:16
Just for the sake of time, I was gonna say: how can you close this out in terms of what developers can be thinking about, or what they should be thinking about, as they keep doing the work, keep inventing, keep innovating? What would be some of the last words you could leave for developers?

Ted Neward 49:41
So, understand that whatever you build will be used in ways that you didn't intend. That's true both from a philosophical perspective as well as from a security perspective. Anytime you build an API, remember that there are always two clients for it: one, the front end that you build, and the other, telnet. There will always be attackers who will try to attack your API through something other than the front end that you built for it. Your software will be used in ways that you cannot imagine. And if you really have a concern for that, then you should build in safeguards, such as, for example, encrypting data at rest if you don't want data to be used potentially maliciously. And don't try to anticipate all the failure scenarios; don't try to anticipate how data will be used. You can't possibly imagine all the creative ways in which data could be used for malicious purposes, unless you're an attacker yourself.
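
The "two clients" point is worth a sketch. Below is a minimal Python HTTP endpoint, with invented route and field names, that re-validates its input server-side, because whatever the front end enforces, an attacker with telnet or curl can send anything at all.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class TransferHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        try:
            req = json.loads(body)
            amount = req["amount"]
        except (json.JSONDecodeError, KeyError, TypeError):
            return self._reply(400, "malformed request")
        # The front end's form may only allow positive numbers, but a
        # raw client can send anything, so enforce the rule here too.
        if not isinstance(amount, (int, float)) or amount <= 0:
            return self._reply(400, "amount must be positive")
        self._reply(200, "ok")

    def _reply(self, code: int, msg: str) -> None:
        self.send_response(code)
        self.end_headers()
        self.wfile.write(msg.encode())

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TransferHandler).serve_forever()
```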

Ted Neward 50:42
You really just don't have the breadth of experience. Just accept that it's okay to say, I don't know, but you can still take steps to correct for that: like encrypting data at rest, like thinking about how the keys will be managed, like wondering what happens if a system administrator decides to go rogue. How do we protect our systems against the sysadmin? Which is an interesting question to ask, and yet this is exactly what led us to: system administrators can no longer see passwords in a properly managed environment. It's by asking a number of these questions, by engaging in some of this philosophical inquiry, that you're going to come up with a lot of questions that don't have good answers, or don't have answers at all. And that's okay. That's part of philosophy, because then, if you ask a question and realize there is no answer, and realize that's a problem, now you are potentially doing something that nobody has done before, at least successfully.
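
The "sysadmins can no longer see passwords" point usually comes down to storing only a salted, slow hash, so that even someone with full database access can't read a password back. A minimal sketch using Python's standard library; the iteration count here is illustrative, not a vetted policy.

```python
import hashlib, hmac, os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique per user, so identical passwords differ
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest    # store both; the password itself is never stored

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))  # True
print(verify("guess", salt, digest))                         # False
```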

Ted Neward 51:45
Because if they had, there would be an answer. And now you could potentially be on the cusp of doing something really, really useful, if you can figure out a way to, in fact, allow us to secure systems without requiring passwords. Hey, that's a great thing, and we are now starting to see a rise of passwordless forms of authentication and security. If you can think of ways to try to prevent people from being able to do malicious things with software by requiring a second form of authentication, there's 2FA right there, and so on and so forth. It begins with asking those questions, and it begins with asking those questions of people who are not you and are not like you, because you want a diverse perspective. One of the reasons why Amazon ran into trouble with their resume system is because they tested it on themselves, and because their team was made up of white guys, the software worked great for white guys.
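
The second factor Ted mentions is typically TOTP (RFC 6238), the rolling six-digit codes in authenticator apps. Here is a compact sketch using only Python's standard library; a real deployment should use a maintained library and handle clock drift and retry windows.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period        # time step since the epoch
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                     # RFC 4226 dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # demo secret; the code rolls every 30s
```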

Ted Neward 52:49
Facial recognition software has this exact same problem. This is where diversity of teams really comes into play, particularly diversity of thought. If you really want to test whether or not your software is accessible, hire somebody who's hard of sight, or colorblind, or has some sort of motion disorder such that they can't really control the mouse; hire somebody onto your team to do that, if you're committed to building software that is accessible to everyone, as opposed to just, hey, do we have these things, go through the checklist, we're good. This notion of asking yourself not how do I satisfy the requirements, but: what are we building? Who is it for? Why do they want it? Why do they care? Why should they care? These are hard questions to answer, but as philosophers will tell you, it's the hard questions that are usually the good ones, and they will lead to positive outcomes. So, it begins with learning some basics of philosophy and then asking hard questions. And that's about as concrete as I can get.

Omar Torrres 54:06
Excellent. Well, thank you so much for being on, Ted. I feel like the conversation has been enlightening, at least to me. I've learned a lot, so it feels good. Steve, any final words?

Steve Poole 54:19
No, no. We could go on forever, but let's not.

Omar Torres 54:23
Okay.

Kadi Grigg 54:26
Thanks for listening to another episode of Wicked Good Development, brought to you by Sonatype. This show was co-produced by Kadi Grigg and Omar Torres, and made possible in partnership with our collaborators. Let us know what you think, and leave us a review on Apple Podcasts or Spotify. If you have any questions or comments, please feel free to leave us a message. If you think this was valuable content, share this episode with your friends. Till next time.


Written by Kadi Grigg

Kadi has been passionate about the DevOps / DevSecOps community since her days of working with COBOL development and mainframe solutions. At Sonatype, she collaborates with developers and security researchers and hosts Wicked Good Development, a podcast about the future of open source. When she's not ...
