This month we explore the themes of culture and language within the technology space. Future State speakers Mikaela Jade, Founder of Indigital, and Danielle Krettek Cobb, Founder of Google's Empathy Lab, share their insights.
Age is just a number
Longevity is not an achievement. Not when your cultural place, your country, is 60,000 years old, says Cabrogal woman Mikaela Jade of the Dharug-speaking Nations of Sydney. “We’re not particularly proud of how long we’ve been practising our culture. It just is – and has been since the beginning of time.”
At Future State, Mikaela shared why First Nations people must tell their own stories, and technology's part to play in this. She referred to her time as a park ranger in Kakadu National Park in the Northern Territory – one of the most remote places in Australia. The park's story was mostly told through the lens of anthropologists and archaeologists, with a focus on its age. This made Mikaela question whether there was a way to tell the story as a First Nations person would tell it.
In 2012, Mikaela saw augmented reality (AR) for the first time. “I imagined a world where we could put our phones up and have our elders appear in holographic format – to tell the right story, at the right time, at the right place, in the right language and in the right season. So I set about building that.”
Two years later, she presented the AR storytelling application at the United Nations. First Nations people from all over the world approached Mikaela, wanting her help to tell their own stories in their own way. With an AUD $200,000 business loan, Indigital was born, along with an app that would enable anyone – from five to 105 – to tell their story.
Being a trailblazer is an exciting concept. But it’s a tough reality for those, like Mikaela, who find themselves navigating uncharted territory.
“It's been particularly difficult to be a woman – to be a First Nations woman – and to be working in critical technologies. There wasn’t anyone I could look to in my tribe, or in other mobs around Australia, or even internationally, that had succeeded in this technology. So I had to be somewhat of a trailblazer, and make a lot of mistakes,” Mikaela says.
“But what has been amazing along the way is to be able to work with 9000 young people in our country, and inspire the next generation of digital makers...to see ourselves as technology experts.”
It’s not enough to enable First Nations people to tell their stories. Mikaela wants to see them actively participating in technology, and strives to steer young people away from low-paid jobs and into high-income careers.
Who can say they’ve hacked Google? Zedekiah can. He managed to replace the Google Doodle with his school crest. His school's reaction was to ban him from computers, and his careers advisor suggested he channel his technology skills into becoming a diesel mechanic in the mines.
Mikaela and Indigital spent some time with Zedekiah and his family. A simple question was asked – what do you want to do? Being a proud Indigenous man was important to Zedekiah, as well as using robotics to find culture under the sea. “So we talked to him about mechatronics and the opportunities in the Navy, and he’s on a different pathway now,” says Mikaela.
Teachers, advisors and parents have a huge influence over young people’s career choices. When it comes to First Nations people, Zedekiah’s story shows that the least we can do is expose young people to a wider range of professional options.
Technology encompasses everything from practising with the boomerang to machine learning. As such, First Nations people are no newcomers to technology, and yet they can still find themselves excluded. Mikaela says it’s frustrating when technologies aren’t created for the people who will use them, and it can “amplify the gap that First Nations people experience in places like Australia”.
Conversely, when technology is created with Indigenous knowledge, language and lore embedded in it, the result is far more powerful. As Mikaela says, it “helps us not only express ourselves as cultural people, not only see ourselves in future technologies, but also if we do it for ourselves, we’re actually making technology that works for the rest of the world.”
Indigital has been paving the way in critical technologies for 12 years in Australia, working with 170 different communities across the country.
Mikaela believes non-indigenous people can assist not by offering to lead, but by creating space for First Nations people to learn and work in tech. When that happens, everyone wins.
“I think we’ll actually teach governments and businesses a thing or two about bringing forward the oldest knowledge systems in the world to their business operations,” she says.
As Indigital becomes more involved in critical technology infrastructure projects and Metaverse technologies, Mikaela stresses that Indigenous knowledge has to become more highly valued.
“We need to have our knowledge systems valued equally to Western professionals,” she says. “For too long our elders have felt grateful to be part of these projects. We’re really advocating and pushing very strongly back in Australia that the knowledge our elders hold is equivalent to that PhD over there in engineering, or it’s the equivalent to that lawyer over there.”
Intergenerational knowledge is not a one-way street when it comes to technology. When elders learn alongside young people, richer applications emerge.
While young people are often excited about the project, they recognise they don’t have the cultural authority to embed the knowledge, language and lore in these new technologies. They need the elders in the room.
These conversations can only happen when First Nations people see themselves as protagonists of a digital future, rather than victims of it. As Mikaela explains, “a lot of our communities in Australia unfortunately have been shoe-horned into low-skilled labour roles, and these roles are disappearing because of robotics and machine learning.”
Enabling First Nations people to access the digital economy starts with ensuring they have access to devices and an internet connection – a big challenge when communities are scattered across the vast Australian continent.
Despite the barriers, learning to be digital can begin in a very analogue way. “We teach machine learning algorithms using paper cups and sticks as a methodology,” Mikaela says.
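The article doesn't spell out the cup-and-stick method, but one classic unplugged approach treats each cup as a game state and each stick as a vote for an action – reinforcement learning in physical form. The sketch below simulates that idea; the two-choice game, starting stick counts and update rule are illustrative assumptions, not Indigital's actual curriculum:

```python
import random

random.seed(0)  # deterministic, for the sake of the example

# Each "cup" is a game state; the "sticks" inside are votes for each action.
# Drawing a stick at random picks an action; wins add sticks, losses remove them.
def pick_action(cup):
    pool = [action for action, sticks in cup.items() for _ in range(sticks)]
    return random.choice(pool)

def train(trials=1000):
    cup = {"left": 5, "right": 5}  # start with five sticks per action
    winning = "right"              # the hidden rule the learner must discover
    for _ in range(trials):
        action = pick_action(cup)
        if action == winning:
            cup[action] += 1       # reward: add a stick to the cup
        elif cup[action] > 1:
            cup[action] -= 1       # penalty: remove a stick (always keep one)
    return cup

cup = train()
print(cup)  # "right" ends up with far more sticks than "left"
```

The same procedure works with real cups and sticks on a classroom floor, which is what makes it such an accessible way to show how a machine "learns" from reward and penalty.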
There is also an understanding that technology, like everything else, comes from the land. It is made up of precious mineral resources that must be honoured and considered. “So not just thinking about the nuts and bolts of the technology, but thinking about the entire pathway of creating a piece of technology that we’re using,” she says.
Creating a pathway into technology careers for Indigenous people starts at the top, with the CEO and leadership team. There needs to be investment in initiatives that encourage First Nations people into tech roles. It’s also important that these roles are described in ways that appeal to young people.
“[Business leaders] need to approach First Nations communities and spend time with our mob, and understand the way that we speak about critical technologies and the world around us. It’s not our responsibility as cultural people to try and translate the opportunities that are within a business. It’s the other way around,” Mikaela says.
Equally, when working with First Nations communities, Mikaela stresses the importance of true collaboration from the start. Cultural knowledge can be embedded into every part of the technology stack, and by involving the entire community, young people can see first-hand the power of tech.
“They start getting interested in engineering, in mathematics, and then all of a sudden, our mob have pathways into these careers that are operating on their country anyway. Whereas before they would have seen that as something else, now they can be part of it.”
When businesses meet First Nations people in their communities, powerful things can happen.
“In Australia, we have some really great initiatives where senior executives spend weeks on country, living and working in Aboriginal communities — to immerse themselves, to ask questions in a safe place and to learn,” says Mikaela.
Spending that time is a way of earning the right to learn the knowledge of First Nations people. This knowledge can then be used to create better products. An example of such product innovation is the use of spinifex – a spiky native grass – in condom manufacturing.
“Western science and Indigenous science came together, and in the lab they were able to create the world’s strongest latex through using these two incredible ways of thinking,” Mikaela says.
Another example is honey, where provenance adds to its value – something further enhanced by geo-specific information. Partnering with Indigenous people can also lead to a better understanding of ways to reduce a company’s carbon emissions.
By contrast, when Indigenous people enter business environments, they can find them unwelcoming spaces. Mikaela notes the high number of First Nations people attracted to roles in large corporates, only to leave within 18 months. Cultural misalignment can be felt in anything from employment policies to dealing with customers.
“The other big trap that businesses fall into is putting the cultural load onto the First Nations people to do everything that is cultural,” Mikaela explains. “They’re asked to do anything that’s related to Indigenous initiatives. They have to take that on.”
On the other hand, when organisations do involve First Nations people in senior leadership, it can be very successful for both parties. It’s therefore important to provide a culturally safe environment for those starting out, to enable them to progress through the ranks.
“We have a lot of people that are in traineeships or are situated in lower levels of an organisation, where there’s a massive opportunity to bring in our skills, knowledge, relationships — ways of knowing, being, and doing — through all levels of the organisation,” Mikaela says.
As Founder of the Google Empathy Lab, Danielle Krettek Cobb has spent the past four years working with large language models. But it's only in the last six months, with the sudden and widespread popularity of AI tools, that people are starting to understand her work.
Chatbots are trained “on billions of data that we would call conversations,” Danielle says. “Basically...they’re training on our language. They’re training on the way we relate.”
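Danielle's point can be made concrete by looking at how conversational training data is commonly structured: role-tagged turns that are flattened into plain text before training. The schema below is a generic illustration – real chat-model formats vary between providers, and the helper function is hypothetical:

```python
# A conversation stored as role-tagged turns. This schema is illustrative;
# actual chat-model training formats differ between providers.
conversation = [
    {"role": "user", "content": "Can you help me practise a difficult conversation?"},
    {"role": "assistant", "content": "Of course. Tell me who you'll be speaking with."},
]

def to_training_text(turns):
    """Flatten role-tagged turns into the single text stream a model trains on."""
    return "\n".join(f"{t['role']}: {t['content']}" for t in turns)

print(to_training_text(conversation))
```

Because the model only ever sees this flattened text, whatever patterns of relating appear in the source conversations are exactly what it learns to reproduce.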
What these chatbots aren’t doing is training on inclusive data sets. According to Danielle, this is a missed opportunity for organisations and businesses wanting to help their people connect with their stakeholders and customers.
The wider the language model, the more interesting the conversation.
“One of the things I was looking at in the beginning was race dynamics,” says Danielle. “There is an unfair burden on minorities to educate majorities about how things work, and what the experience is like for them in their community. What I think is interesting is conversations can be practised in a way where there are not high stakes.”
If it were possible to learn from AI chatbots trained on language sets from multiple perspectives, Danielle says we could get more comfortable talking to people with different life experiences. We could rehearse an encounter with an AI conversational assistant and gain a better understanding of someone’s point of view, before meeting in person.
AI can be a powerful tool used to connect people and communities, rather than disempower and destabilise them. “How can we use [chatbots] and train them on the best of us, on the most cutting-edge science, on the most deeply interesting human traditions of our species?” questions Danielle.
Through her work with the NASA Jet Propulsion Laboratory, Danielle sees the potential of AI to connect us as a species in the same way that space exploration has.
When the Voyager spacecraft turned its cameras back towards our solar system and took images labelled by NASA as ‘the family portrait’, we saw a very fragile Earth. This realisation led to better climate change conversations.
Collaborative AI could have the same positive impact on humanity. “What will we become aware of that’s so meaningful and useful and critical and necessary? What will save us because we can understand something from a different perspective?” she asks.