LEXI MILLS



Ethical Approach to Search with Lexi Mills

The first time I saw Lexi Mills speak, it blew my mind. I’ve been in the industry for over half a dozen years and never once did I think about the ethical approach to our search results. A massive thank you to Lexi for taking time out of her busy schedule to come and be a guest on the podcast.

Feb 16

Trust me, it’s a must-listen episode, so download it on iTunes now.

About Lexi Mills:
Runs an agency called Shift6 (shift6.org should be live in a few weeks), working with brands and agencies and focusing on integrating PR with SEO, reputation management, the ethics of search algorithms, and other thought-leadership programmes.
More on leximills.net
Tweet her using @leximills
Formerly of Distilled, a digital marketing agency focusing on content and digital PR.
Started out in the music industry.
I met Lexi at Search London where she was a speaker.
Where It All Started: The Ethical Approach to Search
Lexi started out in SEO as someone who built links naturally for her clients. That was the ethical approach: doing things the right way rather than tricking Google, which also meant you were less likely to get caught. At first Lexi didn’t have a search background; hers was more PR-based, so she was able to get into conversations that were detailed and involved, which potentially meant she skipped the middle ground. Along the way she learnt what black hat versus white hat SEO is.

As she continued to work in the search industry, she became interested in Google’s aspirations. Where does Google want to go next? As this was, and is, primarily to serve the user, the penny dropped…

The internet is how most people determine a reality, a truth, an answer and a direction, whether that’s metaphysical or physically getting somewhere.

This truth is governed by a search engine, a corporate entity, which to Lexi sounded like a disaster, so she started researching the history of communications that have dominated the world, the dominant groups behind them, and how technology has been a part of this. She looked into the first TV advert, the first advertisements in general, and people’s reactions to them. She was interested in how people were averse to a new technology back then and how it evolved from something new into something widespread. So she is trying to work out where this is good or bad.

Ethics in Localised Search & Personalisation
Localised or customised search is great, until you realise that it serves your intent but blocks out other perspectives.

Lexi Mills is from South Africa and found that, living in the UK or US, she had no visibility of other countries, such as her home. For example, whole newspapers in South Africa were once printed black because freedom of the press had been revoked. This never came up in her news feed, even though she would have been very interested to know about it.

The filter bubble effect. 

When she was moving to New York, Lexi experimented with her own filter bubble. She picked the coolest people she knew from an intellectual and aspirational standpoint, made a spreadsheet of every page they had ever liked and every place they had ever checked into in NY, and found the correlating factors. She then liked those pages herself and found that she felt like she’d been in NY for years when she’d only been living there for a short time.
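As a rough illustration only (the people and pages here are invented, not from Lexi’s actual spreadsheet), here is a minimal Python sketch of the “correlating factors” step: count how many people on the list liked each page, and keep the pages that several of them share.

```python
from collections import Counter

# Hypothetical data: pages liked by each person on the spreadsheet.
likes_by_person = {
    "alice": {"Brooklyn Museum", "Smorgasburg", "The Strand"},
    "ben": {"Smorgasburg", "The Strand", "Film Forum"},
    "cara": {"The Strand", "Film Forum", "Brooklyn Museum"},
}

# Count how many people liked each page.
counts = Counter(page for likes in likes_by_person.values() for page in likes)

# Pages liked by at least two people are the correlating factors
# worth liking yourself, most widely shared first.
correlating = [page for page, n in counts.most_common() if n >= 2]
print(correlating)
```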

The filter bubble, being locked into an echo chamber, can be good or bad. You can accelerate your integration into a culture very quickly and design which culture you want to be a part of. Lexi is therefore into data surrender: we spend a lot of time and energy trying to protect our data, and she isn’t convinced we can do it anyway. The fact that the data was available to Lexi made the experiment above possible; perhaps it’s the moral standpoint of the person using the data that we should question.

When it comes to opinion-based filter bubbles, such as political views, I remember reading that sometimes we need conflicting opinions, either to sway our thinking or to make us feel more strongly about our own. Lexi said there are studies showing that we prefer to hang out with people who agree with us, and that time-on-page and bounce-rate data suggest we don’t want conflicting opinions either. We’re almost too used to being agreed with.

The dangerous effect of the filter bubble, therefore, is that we don’t know what we’re not seeing, and so we’re shown only what we know rather than what we might need to know. A metaphor for this is The Matrix, where Neo is offered a blue or a red pill; the filter bubble stops you knowing that there’s an alternative pill. It’s not being in the chamber that’s dangerous so much as not realising that you’re in it.

Lexi is also concerned about international branding. We want to craft an image of the West for third-world countries that will not cause adversity. In the times of slavery, slaves were not taught to read or write, largely so that they wouldn’t know what they were missing. With third-world countries now looking at the West through the window of the internet… “you can see how The Hunger Games start to come about,” says Lexi.

Unfortunately we don’t get to pick when we’re put into a bubble, but we can at least be aware of it.

We had a bit of a tangent later on…

It would be great if we could learn to understand each other through search. It would be good to have another search engine offering this: showing us the advertising and personalisation another type of person sees rather than what we see ourselves. Lexi would love us to be able to jump in and out of our bubbles and share them with each other, and I totally agree!

Can Search Technology Predict Illness?
Search has great power. It governs reality, and it could govern health and our connections with people, in either good or bad ways. Lexi therefore spends as much time as she can trying to get people to do this in a good way.

If a search engine were able to predict that you have an illness from the questions you’ve typed into it, that would be awesome. People in poverty especially, who don’t have the money to see a doctor, might survive thanks to this information.

The internet is made up of three points:

The stuff we put on it
People we interact with and those data points
The people writing or refining the algorithms
Two of those three points largely belong to the public, so the internet belongs to humankind. We should therefore have access to it so it can help us.

Wearable Technology: Wearable Data
Using our data, wearable technology will know more about us than we know about ourselves. If this could be fed into a search engine like Google, it would be able to tell when we’re in a heightened state of emotion and perhaps diagnose more illnesses.

Lexi gives the example of having an argument with a friend and then typing in something like ‘bar’. It would be great if Fitbit data could recognise that you were in a heightened state of emotion and offer results based on it: telling you to go for a run instead, or showing the venues you asked for alongside the places your friends are right now. Used this way, it could fundamentally change our relationship with data. Then again, Facebook’s nearby-friends feature is best used for avoiding people, Lexi suggests, so it can go both ways.
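Purely as a thought experiment (there is no real Fitbit or Google API that works like this; every name and threshold below is invented), the kind of logic being described might look something like:

```python
# Hypothetical sketch of biometric-aware search results. The heart-rate
# threshold, the data source, and the result handling are all invented
# for illustration; no real Fitbit or search-engine API works this way.

def heightened_state(current_hr: int, resting_hr: int, factor: float = 1.4) -> bool:
    """Crude proxy for emotional arousal: heart rate well above resting."""
    return current_hr > resting_hr * factor

def results_for(query: str, current_hr: int, resting_hr: int) -> list[str]:
    base = [f"Top venues matching '{query}'"]  # stand-in for normal results
    if heightened_state(current_hr, resting_hr):
        # Offer a calmer alternative alongside what was actually asked for.
        return ["Feeling wound up? Running routes near you"] + base
    return base

# After an argument, heart rate is elevated, so the suggestion changes.
print(results_for("bar", current_hr=115, resting_hr=65))
```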

What Does Ethical Search Mean if You’re in SEO?
There’s Google, there’s us and then there’s the users.

Google dictates what shows up, and we try to hack it (or please it); as SEOs, we get some control over that. It’s exciting that we get to shape the results, offering either the blue or the red pill (see The Matrix mention above). Lexi also believes we’re closely tied to the future of healthcare, where search behaviour is exceptionally revealing. The industry may see the same skill sets being used in the future to tap into this.

Ethical points on Image & Voice Search
Voice search isn’t something Lexi is excited about. She says it’s the evolution of Google offering instant answers, and she’s concerned that it doesn’t offer the background information, because the user never needs to look at their device to get the answer. Users are unlikely to do research around their subject when using voice search, but at the same time she believes we may not use it for everything, because there are simply some things you don’t want to be speaking about in public.

Image search, however, has serious implications for ethical representation. Lexi gives examples of stereotypes, be it racism when we search for terrorism or sexism when we look at images for a specific job role. These put us, and other people, at risk. Lexi wants search professionals to take on hacking all of those gender stereotypes as a project!

Mental Health
Efficiency isn’t always healthy. The faster we get information, the faster we can attack ourselves. People with mental health issues are often told to switch off apps that bombard them with information, to regain a healthy sense of calm. We need to be careful, and perhaps think about what information should be presented for certain types of question. Maybe Google could be a philosophy search engine, asking you a series of questions.

In a talk I saw her give previously, Lexi shared a personal example: she had a panic attack on a train after searching for the life expectancy associated with a close relative’s cancer diagnosis. In hindsight, it would have been healthier for her to see a selection of questions first, such as how long the person has had the illness and how old they are, questions that could lead to a better answer than a bare table of data that shocks the mind. Information needs context, and that’s a challenge when the user doesn’t want to read it.

So, when it comes to featured snippets for searches like the one above, it’s worth asking: will anyone benefit from having this featured snippet? Would it be healthy for them to have it, and should there be a way to appeal it? Is there something we can produce to override the impulse for instant answers that are unhealthy? The metaphor Lexi gives is a cupcake sitting next to a celery stick, and having to persuade people to eat the celery, quickly.

Ethical Issues with Artificial Intelligence & Virtual Reality
Artificial Intelligence:

I have a friend who uses an AI for his meeting and calendar organisation, so I was suspicious of Lexi’s assistant. Lexi confirmed that her assistant isn’t an AI, though she has looked into it. She says there are just some things artificial intelligence can’t do or understand: her human assistant knows when she may need a rest and how to organise her schedule while keeping her healthy.

Many people see AI taking our jobs, and although that is inevitable, Lexi and I discussed how it’s similar to the Industrial Revolution. These doors swing both ways: although it’s easy to be negative when we’re risk-managing our own survival, for every con there’s an opportunity. Historically we’ve found other things to do with our time, so perhaps this time it’ll be the same.

Virtual Reality:

Lexi had a meeting in virtual reality with attendees who hadn’t met her before, and the meeting went well. When they later had a Skype call, the other attendees were noticeably taken aback at how young and petite Lexi is. This made her think that the meeting may actually have gone better than usual because they did not see her in person. Perhaps unconscious (or conscious) stereotypes have more of an impact on our day-to-day life and work than we realise. A virtual reality meeting can therefore offer a ‘fair playing field’, and everyone can become aware of their own unconscious bias.