15/04/2025 • 11:00 am
This webinar focused on GenAI, making the case for a values-based approach to technology standards and for AI as a thought partner, especially for SMEs, while addressing prevalent uncertainties and stressing responsible adoption. It discussed DEI initiatives, the challenges of job transitions, and the importance of a growth mindset in adapting to rapid technological change. The conversation also touched on the role of inclusivity and diverse perspectives in shaping the future of technology.
Video: TBA
Krista:
There’s a way forward, but we need to have these conversations openly rather than avoid them. If we don’t, the problem just festers and nothing changes. We need to find constructive solutions and not shy away from difficult topics.
Krista:
There’s going to be job losses. Humans suck at transitions—we know this. Anytime there’s been a major one, we’ve struggled and it stresses us out. Instead of focusing on what we don’t want, let’s collectively come together and say, “What’s the future we do want? How do we design for that?” If policymakers and educators focus on making transitions as painless as possible, it gets interesting and exciting.
Mark:
We have to go through transition, but it’s about minimizing the negative impacts. Our audience is often people in transition—job seekers, students, people preparing for the future. These conversations are especially relevant for them.
Krista:
Change is never going to be as slow as it is today. For those feeling whiplash, that’s a terrifying statement, but the good news is we’ve been here before. Humans know how to adapt. Moore’s Law observed that computing power doubles roughly every two years, and that compounding growth is what drives change. The difference now is the speed—past generations faced similar anxieties, but at a slower pace. We can look to history to learn how to manage transitions.
Mark:
It’s important to keep looking forward and recognize this is the new reality. Becoming comfortable with discomfort and finding ways to adapt is really important. Skill cycles are now as short as six months, so mindset is crucial.
Krista:
I look to a growth mindset, which I expand into a “futures mindset.” There are an infinite number of futures out there, and we get to choose the ones we want to go towards. That’s exciting and helps build resilience around uncertainty. The job I have today didn’t exist five years ago. What matters is learning how to learn, being critical, thinking in new ways, and continuously bringing new skills into your skill set. The skills and capabilities of critical thinking, adaptability, and learning are what matter most.
Krista:
I use AI every day in multiple ways. I see AI as our collective brain—it helps us tap into what humanity has already learned and created, but it still needs us. AI can’t ask questions that have never been imagined. I use AI as a thought partner to help me think in new and creative ways, break biases, and generate new ideas. For example, I use ChatGPT to help my kids understand complex topics or to prepare for job interviews. It’s a powerful tutor, but you have to ask it good questions and always review its output.
Mark:
Using AI as a thought partner is super powerful. Be intentional with your prompts—give AI a role, like “You’re Warren Buffett, review these financial statements.” Ask for alternative perspectives to avoid echo chambers. It’s not just about getting an answer, but about deepening your thinking.
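Mark’s role-prompting advice can be sketched as a small helper that builds the message list used by most chat APIs (the system/user message format popularized by the OpenAI Chat Completions API). The function name and wording are illustrative, not from the webinar; a minimal sketch:

```python
def build_role_prompt(role: str, task: str, ask_alternatives: bool = True) -> list[dict]:
    """Build a chat-message list that assigns the model a role and,
    optionally, asks it to surface dissenting perspectives to avoid
    echo-chamber answers."""
    messages = [
        {"role": "system", "content": f"You are {role}."},
        {"role": "user", "content": task},
    ]
    if ask_alternatives:
        messages.append({
            "role": "user",
            "content": "After answering, list two perspectives that "
                       "disagree with yours and explain why.",
        })
    return messages

# Example from the discussion: give the AI a role before the task.
msgs = build_role_prompt(
    "Warren Buffett, a value investor reviewing financial statements",
    "Review this balance sheet and flag anything unusual.",
)
```

The same message list could then be passed to whichever chat model you use; the point is the structure, not a particular vendor.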
Krista:
We’re being fed data through algorithms that reinforce specific worldviews and close us off to other ideas. We can use these technologies to break down echo chambers, but we have to be proactive. It’s critical to bring everybody to the table when designing technology and standards, so we don’t perpetuate bias or exclusion.
Mark:
We’re seeing a significant increase in things like elder fraud using misinformation. It’s important to recognize these challenges and work to ensure our communities are aware and informed. We need to be vigilant about the downsides of these tools, even as we benefit from them.
Mark:
I use the example of the 1904 Baltimore fire, where fire departments from different cities couldn’t help because their hoses didn’t fit the hydrants—there were over 600 different ways to connect hoses. Lack of interoperability hindered disaster response. We need to think about interoperability standards now, before a crisis, so we’re ready. AI is like fire: it brings huge benefits but also risks if not managed properly.
Krista:
Standards are more relevant today than ever. The way we create standards needs to evolve to a values-based model. Technology is moving so fast that by the time you agree on a standard, it’s outdated. We need to agree on principles and values, not just technical details, so we can be proactive and adaptable.
Krista:
I believe I should have control over data about me and be able to call it back. Technologies like blockchain are interesting because they let you track where your data has gone. But there are real risks—look at what happened when a major DNA testing company went bankrupt and all that personal data became an asset for sale. Always ask yourself if you’re comfortable sharing information, not just for what it’s used for today, but knowing it could be out there forever. Read privacy policies and consider the long-term implications.
Mark:
I support Krista’s caution—always read privacy policies and think about the long-term implications of sharing your data with AI tools.
Krista Pawley:
Vivienne Ming once said, if a student turns in nothing, they get an F. If they turn in something generated by ChatGPT, they get a D—it shows they did the work, but it’s generic. The difference between a D and an A is what the student does with the AI-generated response: how they evolve it, what unique thinking they bring, and how they advance the conversation. Professors shouldn’t ban AI; instead, they should teach students to use it intelligently and productively. The key is to ask, "What questions does this prompt in me? How might I advance this? What perspectives can I add?" AI can’t imagine what hasn’t been imagined—that’s the student’s role.
Mark Patterson:
Most people don’t work in the field they studied. Software development is a great way to learn how to learn. The real value comes at the intersection of your skills and your passions. Pursue your interests—solutions often emerge where your passion and skills overlap, such as applying tech skills to sustainability challenges.
Krista Pawley:
A practical tool is the game "The Thing from the Future," which prompts creative thinking about future scenarios. Fun and creativity enhance learning. The overlaps between your interests and your technical skills are where innovation happens. Your curiosity about sustainability will lead you to see applications for technology that others might miss.
Krista:
ChatGPT can help automate tasks. For example, you can ask it to create a LinkedIn campaign and generate a calendar file with daily posts. For more advanced automation, you can use custom bots or no-code platforms, but always review privacy and ethics policies before using new tools.
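The calendar-file idea can be sketched without any AI tooling at all: once ChatGPT has drafted the posts, a few lines of Python can turn them into a minimal iCalendar (.ics) file, one all-day event per post. The post titles, dates, and UID domain below are hypothetical placeholders:

```python
from datetime import date, timedelta

def posts_to_ics(posts: list[str], start: date) -> str:
    """Turn a list of post drafts into a minimal iCalendar file,
    scheduling one all-day event per post on consecutive days."""
    lines = [
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//linkedin-campaign//EN",
    ]
    for i, post in enumerate(posts):
        day = start + timedelta(days=i)
        lines += [
            "BEGIN:VEVENT",
            f"UID:post-{i}@example.local",
            f"DTSTART;VALUE=DATE:{day.strftime('%Y%m%d')}",
            f"SUMMARY:LinkedIn post: {post}",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    # iCalendar uses CRLF line endings per RFC 5545.
    return "\r\n".join(lines)

ics = posts_to_ics(
    ["Why standards matter", "AI as a thought partner"],
    date(2025, 4, 21),
)
```

Writing `ics` to a `.ics` file makes it importable into most calendar apps; a fuller implementation would also handle text escaping and time zones.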
Mark:
This is the year of agentic AI—tools are emerging that let you automate processes across platforms without coding. The real opportunity is not just automating existing tasks, but rethinking processes from the ground up. Always consider privacy and data security when adopting new tools.
Funded in part by: