Episode 17 - Spirit Animals and Defrag Wrap-up pt 1

GUESTS :

SHOW NOTES :

  • Mindful Cyborg Truth or Dare
  • Spirit Animals!
    • Lion
    • Dolphin
    • Hedgehog
    • Leopard
    • Tiger
    • Fox 
    • Raven 
    • Otter 
    • Crow 
    • Cardinal 
    • Greyhound 
    • Cheetah 
    • Wolfcat 
    • Monkey 
    • Spider
  • Themes for Defrag?
    • Need access to our data!
    • Employees disconnected from IT policy
  • Cluetrain Manifesto 
  • Brad Feld on depression and the myths of the startup 
  • Self awareness
  • Buddha hacking
  • How can we use this in the healthcare industry?
  • Building the perfect human-machine coexistence?
  • Two themes via Chris:
    • Tech in service to humanity
    • Tech in service to corporatists 
  • A void in the room with these two themes
  • Lots of data out there -- “you can now learn more about someone else than yourself”
  • “How much can you say publicly about yourself, about your opinion without it reflecting negatively about the corporation?”
  • Quantified work--anybody doing it?
  • Are we getting to a time where we will be worried about employers knowing our health data?
  • Figuring out technology, but are we figuring out how this will impact us as a society?
  • Chris has 12 terabytes of data (!!)
  • He dumped a bunch into Stanford’s Deep Learning Engine and watched it dance

WORD OF THE WEEK : 

void: a completely empty space

EVENTS : 

  • SXSW - March 7-16, 2014, Austin, TX (POSSIBLY SEE KLINT AND CHRIS PRESENT) 
  • Cyborg Camp - MIT Media Lab - August 2014 - Boston, MA
  • Buddhist Geeks Conference - October 16-19, 2014, Boulder, CO

THANK YOU / FIND US :

PREVIOUS SHOWS :

TRANSCRIPTION :

Welcome to Mindful Cyborgs, Episode 17, part 1 of the Defrag Conference wrap-up.

CD: So, Mindful Cyborgs day dos, day tres?

LB: Day tres.

KF: Day 3 of Defrag. Day 2 of Mindful Cyborgs at Defrag because I don’t think we did anything on the first day. Did we?

CD: I have no idea. My spirit animals are in sync with what’s going on.

KF: I barely know who or what I am anymore.

CD: We actually have the de facto co-host of Mindful Cyborgs Alex Williams with us. How are you doing, Alex?

AW: I’m doing great, thank you.

CD: Is that the energy level we can expect for the first show wrap-up? I just need to know.

AW: I’m a cheetah.

CD: Yeah, you are a cheetah. I was going to bring that up. So, a couple of us got together last night as Mindful Cyborgs. We actually played Mindful Cyborgs truth or dare for a brief moment but I thought I’d lead with just the spirit animals that were in the room with us last night. Is that okay?

KF: Go for it.

AW: Yeah and you need to explain the context for the spirit animals.

CD: There was a get-together on the side of a mountain, and a bunch of people from the conference came, speakers and presenters and things, and people were kind of collapsed into little groups, and I thought, well, why aren’t they all kind of talking and sharing? So, I asked everybody their spirit animal secretly and then we turned it into a game. So, these are the animals from last night. Then we’ll jump right into the post-conference wrap.

So, we have a lion, a dolphin, a hedgehog, a leopard, a tiger, a fox, a raven, an otter, a crow, a cardinal, a greyhound, a cheetah (Alex was the cheetah), a wolfcat, and a monkey. Post-show wrap. Thoughts?

KF: How can I follow the list of spirit animals?

CD: Do you have a spirit animal?

KF: I don’t know. Maybe a spider. If I had one, it would be a spider but I’m not sure. I don’t know how you find out but . . .

CD: We just made them up.

KF: Oh okay.

CD: We’re Mindful Cyborgs. We can do that.

KF: Right. Okay.

CD: I’m adding spider to the list by the way. We’ll post this in the show notes.

KF: Okay. Cool. I noticed a few things that kept coming up again and again here at Defrag. One, which was kind of the point of your talk but came up in a bunch of other talks, is that we need access to our data. Another is data aggregation, which is related to needing access to our data. There’s data all over the place, but getting it all into one place so that we can correlate it or do anything with it is a really difficult problem. And then almost every talk also brought up the fact that employees are more disconnected from official IT policy than ever.

Employees are just rampantly disobeying the rules at their jobs, and they’re doing it because they need to get stuff done and IT departments just aren’t keeping up with the way the world works. So, in a lot of ways I was sort of disappointed with some of the talks at the conference, because it feels like we’re still stuck in 2010 or 2009, talking about what’s going to be the next thing in enterprise cloud computing and APIs and stuff. We’ve known that for years now.

CD: Almost four.

KF: Yeah. Well, at least yeah.

CD: I mean, hardcore. Not the lockdown [00:03:49].

KF: Yeah, I mean we used to call it service-oriented architecture back in the day, and that goes back to 2003 or something, probably even earlier. Your partner Doug was talking about how he’s a COBOL programmer, but he doesn’t call what he does APIs; he calls it class modules or something like that.

CD: Yeah, so that goes back 40 years.

KF: Yeah. The Cluetrain Manifesto was published in 1999. Most people still haven’t really caught up to that.

CD: Some people actually need the Cluetrain, but the train does not slow down; it’s too fast for them. Alex, you’re officially an enterprise person. What are your kind of takeaways?

AW: Well, I really find the conversations that are about concepts more interesting than the actual tech itself. That helps me kind of understand what people are talking about, and I was really hit pretty strongly by what Brad Feld talked about. He focused on his topic of depression, but he also talked about it in the context of myth: the myth that’s being created around startups, the myth that’s being created around the people who lead startups, the myth about geographies and how you can only be a hero, a superhero, if you live in San Francisco, for instance, and how that kind of like . . . it hurts us in a way, because we create these personas and the reality is that people are going through these daily struggles all the time and it’s very difficult. But that was part of what drove me to write this story. I’m just trying to figure it out, so maybe we could talk about it, but . . .

CD: We could write it on the air.

AW: The other aspect of what we’re seeing right now with the kind of work that you’re doing for instance and to me this event represents much more so something about the blurring of machines and humans and to me that’s kind of like best stated as kind of like this new concept of a human API, right? And how we . . .

CD: Or a mindful cyborg.

AW: Or a mindful cyborg. There you go. So, depression is still a stigma, but the startup community deals with it probably just like everyone else does. There’s this other side of it saying, well, we’re trying to better understand ourselves and our relationship to data more than ever before. So, what are going to be the preventative ways that we can stay healthy in our minds and in our bodies using these new capabilities?

CD: And we’re joined by a guest today. Please introduce yourself.

LB: Yeah. I’m Lorinda Brandon. This is my first Defrag actually. I was here for Glue. I find it very interesting to kind of balance the two. Similar feeling conferences but very different in some ways. And Chris and I actually were just having a conversation similar to what you were just talking about Alex and I think that some of what we are uncovering or exploring right now is this whole concept of how much we can actually know about our environment and our reactions to people and things and then create an environment that keeps us happy and I find if you wrap that together with Brad Feld’s talk what’s really fascinating is how can we treat or manage or avoid depression by actually creating an environment . . .

CD: I call it a preconditioned environment.

LB: Yeah, preconditioned environments exactly. You know that you’re happiest in this kind of environment. Chris is like so desperately aware of everybody around him.

CD: Desperately aware.

LB: Desperately aware - you can use that - of everybody around him and his reactions to them so you can kind of - you’ve got this self-awareness so quickly and all the time.

CD: Buddha hacking.

LB: Buddha hacking. You have more great phrases.

CD: Thanks.

LB: So, how can we use that in our healthcare industry without bumping into . . . the thing I struggle with as I hear all of these things is that I get so excited with every new speaker here, because I start to piece it together with the previous speaker. How can we put all of these things together and build, like, the ultimate human-machine environment where we’re all happily coexisting? But then, how do you cross that line? What is a medical device, what should be regulated? Do you create an environment where people want so badly to control the environment around them that they never leave their house?

I think there’s some level of psychology and regulation that we have to figure out before we can take that next step.

CD: What I noticed, I mean, my big takeaway from the conference: I was going to go back through the agenda and try to read it, but then I thought, there are two themes for me, is what I heard. There is tech in service to humanity and there is tech in service to the corporatist; every talk was either tech in service to humanity or tech in service to the corporatist. I’m very passionate. Sometimes I go crazy on the show about this whole corporatist world we live in, and I benefit from it because I’m paid by a corporation and we all work in corporations, but there are those two.

I don’t know where those two will meet, but it’s amazing in so many ways to have lunch with Alex and to hang out with Klint and do a couple of these live, and to talk about all these amazing speakers, like your whole session on women in tech. I just realized we are having two conversations. How do we continue to feed ourselves: do we put our tech to work for us, by us, or do we put our tech to work for someone else who can make money, so that we keep wishing we did it for ourselves? I don’t know why that seems so disconnected, but it feels like there’s a void in the room.

LB: Well, if I can jump on that.

AW: What do you mean by a void?

CD: I didn’t hear any presentation . . . Brad Feld’s presentation was so distant from, say, the Plantronics CTO’s, right? One was very, very human, very connected, and then you have the ones in the middle that were kind of human and tech combined, ones that were just kind of life-could-be-better but had a corporatist message around them, but there was just a void. Some people saw a consistent theme. I saw a consistent dichotomy of no, you’re on that side, you’re on that side, but that’s just my vision of it.

LB: Yeah, and I think maybe that’s what I was trying to talk through . . . there are definitely gaps between what we can do for ourselves personally, like a lot of the stuff that you’ve been doing, Chris, and how do we take that? If you take all of that and you turn it into a product, how does that actually work? That’s kind of what I was referring to: what space does that go into when it’s no longer in the personal space? And the other part of that is, as we start to become so aware of ourselves but also so public about all of that, we can look . . . there are a lot of people who are posting all kinds of personal data all over the place.

You can know more about somebody else than you do about yourself right now depending on how wired you are.

CD: Actually, that’s what has happened.

LB: Yes.

CD: That’s not you could. That is what has happened.

LB: It is. And so I think anybody who’s worked for a large corporation knows that there is a portion of that that spills into your personal life. How much can you say publicly about yourself, about your opinion, without having it reflect back on the corporation? So, I think you’re right, there’s a gap there that we haven’t figured out: how does all of this play in, and at what point can a corporation say, about the fact that your anxiety level goes up when you walk into your office space, that they don’t really want you posting that on the web?

At what point do we span that business and personal?

CD: There have been a couple of articles about quantified work and its possible dark sides and light sides. Do you think there’s more conversation now about quantified work? Even more now, or?

KF: I don’t hear very much about it. I mean, there are a couple of companies that are trying to sell quantified work solutions, and so that’s mostly when I hear about it.

CD: When you get pitches?

KF: When I get pitches, or when somebody else covers one of those companies that’s doing something like that. When I go and speak about it, it seems like it’s the first time most people are hearing about it. Yeah, I think there are some huge issues there in terms of, like, what happens if your anxiety level goes up when you walk into your office and your employer has access to that . . . going back to some of the stuff Brad Feld was talking about, the risks of admitting weakness, essentially. Depression is seen as a weakness. Burnout has been an issue.

CD: There’s one part of it which is saying you’re depressed, but then there’s the judgment of people going, well, how long will that last? So, suddenly you’re in this kind of race against the machine. They know you’re depressed, but now do they think, is he sick for months, do I even talk to him, and it’s really awkward.

KF: If there’s a way for your employer to find out that you’re unhappy with your job, you’re just going to get canned immediately.

CD: If we go back four years, people were worried that their employers would see what they put on Facebook. Are we entering a time when we have to be worried about our employers knowing our health data, because so much of that information is being collected? I asked a bank something simple: I said I want to get my information out more easily, to get it into my information flow. They said, Chris, the problem with us opening up any way for you to even tweet out your transactions or get some access to your own banking information is that if you create a way out, you create a way in. And I said, oh, this is what’s happening with health.

In some ways they don’t want to create a way to get the information out for us, because you’re creating this way in, and I guess there are these nightmarish dystopian stories of people hacking pacemakers and silliness like that, but you can have security or you can have safety.

LB: Well, I think you can have security or safety. I think that, again, it kind of comes back to: we’re figuring out the technology to do a lot of these things, but I’m not sure that we’ve figured out how this will impact us as a society. So, to your point, Klint, with quantified work it’s not just does my stress level go up when I go into work, but does my stress level go up when I’m talking to a particular person.

CD: And I know that by the way because I use an ambient noise sensor on my desk.

LB: I know you do, and I know that you were talking to me earlier about your reactions to some people. Well, here you can tell your emotional reaction to people. So, what if other people knew that as they’re talking to you, or what if your employer knew that you’re having trouble with people on your team or with your management team? A lot of what you do in public, at conferences, in your workplace, and in your family has to do with a happy little dance you’re doing to kind of get around your own emotional reactions.

AW: I don’t think it’s going to come to that. How much more do you rely on data to get your questions answered than people right now, today?

CD: What was the question?

AW: How much more do you rely on data to get your answers than people compared to like a year ago?

CD: Almost exclusively.

LB: Compared to a year ago a lot.

AW: So, do you think that’s going to change? I don’t think so. I think more and more of the thoughts that we have, the ideas that we have, the knowledge that we have will be pushed into some form of data. So, we won’t be dealing with projects in that way.

CD: Well, I think there’s a chance . . .

AW: You’ll have very minute tasks, and the data that you need for those tasks will be available, and then it’ll just be a matter of getting it done. And so whether you get it done or not, I think, will still be what matters.

LB: It’s still binary.

CD: I had a guy at lunch who said, “How much data do you have on yourself?” and I said, “What do you mean, like how many characters? I don’t know - hard drive size?” “Yeah,” he said, “well, how much?” I’d never been asked that. I said 12 terabytes. “You’ve got 12 terabytes of information on your activity?” “Yeah, from the last 3 years.” “Have you ever done anything with it?” Actually, Stanford has this deep learning engine; when I was at Singularity, I dumped it in to see what happened. It was amazing to ask myself questions.

AW: Where did you dump it into?

CD: Stanford has a deep learning engine that’s open, and I just dumped a bunch of my data into it to see what it pulled up. Getting to the point where you can ask yourself questions changes the game, because then you can send that to work.

AW: Yeah.

LB: Yeah, yeah. Have you learned things about yourself that you’re uncomfortable learning?

CD: Yeah, but I’m not going to talk about it now. I’m going to change the subject on Mindful Cyborgs.

Thank you for listening to Mindful Cyborgs. Join us next time for part 2 of the Defrag Conference wrap-up.