Mon, 25 September 2017
Today’s guest is Seth Stephens-Davidowitz, author of Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are. During our conversation Seth talks about what it was like to work at Google and why he left, how he went about analyzing the data for his book, why he believes we are all liars, and what he learned about our true human nature.
Seth Stephens-Davidowitz has used data from the internet -- particularly Google searches -- to get new insights into the human psyche. A book summarizing his research, Everybody Lies, was published in May 2017.
He worked for one-and-a-half years as a data scientist at Google and is currently a contributing op-ed writer for the New York Times. He is a former visiting lecturer at the Wharton School at the University of Pennsylvania. Seth received his BA in philosophy from Stanford, and his PhD in economics from Harvard.
The area of big data that Seth researches is ‘social science questions about what people want and need’. His approach is straightforward: it is based on information that humans create on services such as Google and Facebook.
Traditional social science experiments take months, but today it is possible to experiment in minutes using resources such as Facebook.
When asked what a data scientist is, Seth said that it is someone who knows how to code and build models of human behavior to predict what people will do and what will work in the future.
For his book, Everybody Lies, Seth used Google searches to measure racism, self-induced abortion, depression, child abuse, hateful mobs, science of humor, sexual preference, anxiety, son preference, and sexual insecurity, among many other topics.
Just a few of the topics discussed in the book are sex, searches for sons vs. searches for daughters, anxiety, and insecurity.
When asked questions about these sensitive subjects, people may lie. But the searches reveal a variety of areas that people look into yet don’t talk about. People, therefore, seem to have more interest in these topics than they are willing to admit.
Searches with the term ‘daughter’ are most often asking about issues related to appearance. For example, ‘How can I get my daughter to lose weight?’ For the term ‘son’ it is often, ‘Is my son gifted?’ There seem to be marked differences between sons/daughters in the searches that use these two terms.
While common thinking may be that those living in large urban areas such as NYC or San Francisco are more anxious, Seth’s research showed that searches for these terms were higher in Kentucky, Rhode Island, and Maine – and in rural areas – contrary to common thought.
Stereotypes are often wrong. It is often assumed that women have many more insecurities about their bodies. However, the data does not show an overwhelming number of women versus men searching about these topics. In fact, about 60% of such searches come from women and 40% from men – not the ‘blowout’ on the side of women that might have been expected.
Seth’s advice for individuals living in this new data world is to understand that Google has a strong monetary incentive to keep our data private, so he is not worried on that front. What does concern him is that we may enter a society where we put resources such as time and energy toward how we present ‘on paper’, because we worry we might be penalized based upon our ‘paper trail’ – and that could become a problem.
His advice to organizations is that A/B testing (analyzing what people click on) is highly effective and should be used even more.
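To make the A/B testing idea concrete, here is a minimal sketch in Python of how a site might compare click-through rates between two page variants. The click and view counts are purely hypothetical, and a standard two-proportion z-test stands in for whatever analysis a real organization would run.

```python
import math

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Compare click-through rates of two page variants
    using a two-proportion z-test."""
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis (no difference).
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_b - rate_a) / se
    return rate_a, rate_b, z

# Hypothetical counts: variant B gets more clicks per view.
rate_a, rate_b, z = ab_test(clicks_a=200, views_a=10000,
                            clicks_b=260, views_b=10000)
# |z| > 1.96 is the conventional threshold for significance at the 5% level.
```

The point is the speed Seth describes: with live traffic, counts like these accumulate in minutes rather than the months a traditional social-science experiment would take.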
What you will learn in this episode:
● Surprising findings from the most frequently searched terms on Google
● Advice for individuals living in this new data world
● Tips and discussion on Google Trends – a website with data that is available to everyone
● What was it like working for Google and why he left
● How Seth analyzed the data for his book
● Why Seth believes we are all liars
Mon, 18 September 2017
Michael Bungay Stanier is the founder and senior partner of Box of Crayons, a company that works with organizations, ranging from AstraZeneca to Xerox, to help them do more great work. A Rhodes Scholar who earned both arts and law degrees with highest honors from the Australian National University and a Master’s degree from Oxford, he is a popular speaker at business and coaching conferences, and was named Canadian Coach of the Year in 2006.
He is also the author of a number of books. His latest, The Coaching Habit: Say Less, Ask More & Change the Way You Lead Forever, was published in February 2016 and is a bestseller.
Bungay Stanier talks about how it is possible, in 10 minutes or less, to ask strategic questions that drive changes in behavior and build a more engaged, smart, autonomous team – allowing you to work less hard and have more impact, if you stay curious.
The 7 essential coaching questions that he talks about in his book are:
● The Kickstart Question: “What’s on your mind?”
● The AWE Question: “And what else?”
● The Focus Question: “What’s the real challenge here for you?”
● The Foundation Question: “What do you want?”
● The Lazy Question: “How can I help?”
● The Strategic Question: “If you’re saying yes to this, what are you saying no to?”
● The Learning Question: “What was most useful for you?”
To have authentic conversations, the culture needs to be one in which employees feel safe to share. Consider TERA when considering your work environment.
TERA stands for:
Tribe – make it feel like ‘you & me’, rather than ‘you versus me’
Expectation – how do I know what is about to happen
Rank – how do I feel the same as you rather than less than you
Autonomy – how do I get to make some of the choices, rather than you telling me everything I need to do
Bungay Stanier advises employees who want to be coached by their managers to be the change they want to see in the world: practice being more coach-like yourself, ask your manager for what you want (and buy the book!).
His advice to managers who want to get started as coaches is to pick one thing and see if you can get some traction on it. Go to coaching.com and download the first few chapters. Pick a question, build a habit around it, practice it, and when you fall off the wagon, start again.
What you will learn:
Mon, 11 September 2017
Cathy O’Neil is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. Cathy founded ORCAA, an algorithmic auditing company, and is the author of Weapons of Math Destruction.
Cathy says she was always a math nerd. She loves the beauty of mathematics – the cleanliness of it – and says it is almost an art. One of her favorite things is that math is the same no matter what country you go to. She also had an interest in the business world, which led her from academia to work as a hedge fund quantitative analyst.
Big data is both a technical and a marketing term. The technical meaning depends on the technology you are using. Big data used to mean more data than you could fit on your computer – now it means more data than you can process in a simple way, data that needs to be put into another form before it can be used.
The marketing term ‘big data’ is misleading. However, it represents the belief that data collected for one purpose can be used for another: “It is a technology that allows us to collect seemingly innocuous data and use it for another purpose.”
One profession in which O’Neil has looked at the use of big data and algorithms in detail – and discusses in her book – is teaching and teacher evaluations. She says there were teacher evaluation algorithms originally designed to eliminate the achievement gap between ‘rich kids and poor kids’. Eventually, a new system was devised, called the ‘value-added teacher model’.
This new system was intended to replace the previous approach to assessing teachers, which looked solely at the final test scores of a teacher’s students.
The ‘value-added score’ system holds teachers accountable for the difference between students’ final scores and what they were expected to achieve.
O’Neil says that this method ‘sounds good’ and seems to ‘make sense’. However, with only 25 (or so) students in one teacher’s classroom, there is not enough data. Additionally, both the expected and actual scores carry a lot of uncertainty. So this final number ‘ends up not much better than a random number’ – not credible enough to base important decisions on, such as terminating a teacher.
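O’Neil’s small-sample objection can be illustrated with a short simulation. This is a sketch under assumed numbers (25 students, a per-student noise standard deviation of 15 points, and a teacher whose true effect is zero), not the actual model she audited: even for an identical teacher, the yearly value-added estimate bounces around.

```python
import random

random.seed(0)

def value_added_score(n_students=25, true_effect=0.0, noise_sd=15.0):
    """Estimate a teacher's 'value added' as the class mean of
    (actual - expected) scores, each carrying per-student noise."""
    diffs = [true_effect + random.gauss(0, noise_sd)
             for _ in range(n_students)]
    return sum(diffs) / n_students

# The same teacher (true effect = 0) scored over ten simulated years:
# the estimates swing, even though nothing about the teacher changed.
yearly_scores = [value_added_score() for _ in range(10)]
spread = max(yearly_scores) - min(yearly_scores)
```

With a class of 25 and that noise level, the standard error of the class mean is about 3 points, so year-to-year swings of several points are expected by chance alone – exactly the ‘random number’ problem O’Neil describes.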
One of O’Neil’s main points in today’s discussion is that every algorithm is subjective. Whether it is used to evaluate teachers or to hire or fire employees in a financial organization, people should know that they have the right to ask to have the algorithm explained to them. The 14th Amendment provides ‘due process’ to ask why they were terminated, not promoted, etc. – rather than just being pointed to an algorithm’s result.
What you will learn in this episode:
Mon, 4 September 2017
Luis Perez-Breva, PhD, is an expert in the process of technology innovation, an entrepreneur, and the author of Innovating: A Doer’s Manifesto for Starting from a Hunch, Prototyping Problems, Scaling Up, and Learning to Be Productively Wrong (MIT Press, 2017).
Currently Perez-Breva directs the MIT Innovation Teams Program, MIT’s hands-on innovation program jointly operated between the Schools of Engineering and Management. During his tenure, i-Teams has shepherded over 170 MIT technologies to discover a path to impact. He has taught innovating as a skill worldwide to professionals and students from all disciplines; and has gotten them started innovating from pretty much anything: hunches, real-world problems, engineering problem sets, and research breakthroughs.
There is a lot of confusion around the term Artificial Intelligence – AI. What is it?
“Today AI is essentially an aspiration. What we do have is – a lot of – automation, machine learning, data learning and robotics.” The dream is to have a partner. Google hints at how we might one day operate with AI: you type in a keyword and get the information you need, and we are all more powerful because we can so readily find answers. Neither Siri nor Uber is really ‘intelligent’. Intelligence turns out to be much harder than we thought.
Does Perez-Breva think job displacement will happen? He believes we are confusing AI with automation. Automation has always made jobs ‘disappear’ – gas lamps, for example, gave way to light bulbs. We have always lost jobs to automation. The question is how to make sure we are training leaders so that they create new jobs that lead into the middle class.
Automation can create gateways to the middle class, as Ford did 100 years ago. If new jobs are not found, he argues, it is a failure of imagination.
Will robots soon be in all of our local coffee shops, taking the jobs of humans?
It is not as easy as it might seem: the number of robots that would need to be produced and maintained is massive. One robot in one coffee shop is a showcase of human endeavor, but one in every coffee shop seems a bit of a reach.
What you will learn in this episode: