S2. Ep1 - AI in 2025: The Year Everything Changed + Our 2026 Predictions

In this episode, Katie and Noel kick off 2026 with a look back at the whirlwind that was AI in 2025. From DeepSeek sparking the reasoning model race to ChatGPT images breaking the internet with Ghibli trends, it's hard to believe how much has changed in just 12 months.

They reflect on the rise of AI agents, how platforms like Make.com and N8N scrambled to keep up, and the game-changing arrival of MCP (Model Context Protocol). Plus, they discuss how AI-generated images have become so realistic that stock photo sites are now selling them.

Most importantly, they share their bold 2026 predictions, including why Google might overtake OpenAI, what Jony Ive's mystery hardware product could be, whether AGI is actually coming this year, and how AI could start impacting the job market.

If you build AI agents in Make and/or n8n and need an easy way to chat with them on the go from your iPhone, then check out Noel's new app AgentChat on the iOS App Store.

Download AgentChat

How to find us:

Join our membership over on Skool, where we support you on your AI and automation journey. We share exclusive content in the membership that shows you the automations we talk about in action and how to build them. Find out more about the AI Business Club here.

We have a free LinkedIn group (AI Automations For Business), the group is open to all.

New for 2026, you can also find us on Substack, click here to subscribe and get all the latest news and updates from us.

If you would like dedicated help with your automations or would like us to build them for you then you can find our agency at makeautomations.ai

  • Katie (00:27)

    Hello and welcome back to another episode. Happy new year to you. Hi, I'm Katie. And as always, I've got Noel here with me. How you doing this week, Noel? So this week we are doing a sort of roundup of the AI world in 2025.

    Noel (00:35)

    Absolutely fabulous and happy new year to everyone.

    Katie (00:54)

    And we're also going to cast some 2026 predictions, which I think is actually going to be hilarious because 2025 was so wild. Yeah, I just think whatever we think is going to happen, it's just probably going to go in a completely different direction.

    Noel (01:05)

    It was. It was the Wild West. Yes, or they'll do it within like the first quarter.

    Katie (01:24)

    Yeah, by the end of the week, before our podcast is even out. They've kind of got a habit of doing that, haven't they? We talk about something on the Tuesday and by the time our podcast has gone out on the Thursday, everything's changed. You can't believe how much changed some weeks within 48 hours, sometimes within 24 hours.

    Noel (01:43)

    Yeah.

    Katie (01:54)

    Sam Altman, what are you trying to do to us?

    Noel (01:55)

    Exactly. OpenAI got into a habit of releasing live YouTube videos at 6pm on a Tuesday. So we'd already recorded the episode in the afternoon and then they go, "Oh, we're going to bring out this incredible thing. Come and join us at 6pm." I was like, oh, come on. It's good for us though. There's plenty of content for us to talk about, which is brilliant.

    Katie (02:24)

    For sure, for sure. And as well, I feel like if people haven't got into AI or automations, then I feel like this is definitely the year you want to get involved. I feel like no, you haven't missed out. You can still learn so much and get so far ahead within a matter of months. Because yeah, everything does change, but I feel like a lot of the foundational things that you need to know don't tend to change that much. Would you agree, Noel?

    Noel (03:05)

    Yeah, like prompt engineering is always going to be a big thing for now. So getting that basic knowledge is always good.

    Katie (03:20)

    Honestly, Noel is the prompt engineer master. He's excellent at it. So do reach out to Noel if you need any help with that. Okay, so Noel, shall we start with just a few of the things that happened in 2025, just to remind ourselves how much AI and automations have changed within the last 12 months?

    Noel (03:54)

Yeah, so I guess to start off with, we entered 2025 with GPT-4o in ChatGPT. We had Claude models at version 3.5 and Gemini 2 had just come out at the end of 2024. It just seems absolutely crazy that that was only 12 months ago.

    Katie (04:20)

    That feels like years ago! It should be years ago!

    Noel (04:25)

    Yes, it should. But yeah, I had to go back and double check when these came out, just to make sure. But back then, in January and February, we had no real reasoning models within our LLMs. So today we would open ChatGPT and we'd expect it to think about the response before giving it.

    Katie (04:37)

    Yeah, to me that's just basic.

    Noel (05:02)

    It is now. But back then in the olden times, we didn't have that. This is absolutely bonkers. And obviously all that kicked off because DeepSeek, the open source Chinese model, came out and said, "Hey, look, you could use our new model. It does all this reasoning. And it's like a tenth of the price." And you could also download it and host it on your own systems for free. So that kind of sparked a huge change within the industry. OpenAI, Anthropic, Google, all seemed to scramble to quickly release these reasoning models. And initially they weren't great, but they're far, far better come the end of the year.

    Katie (05:53)

    So it'd be really exciting actually to see what they look like by the end of 2026. I wonder if next January we will be going, "My goodness, can you remember when LLMs only did this?" And it's really funny, although DeepSeek really led the way, it paved that path, didn't it? And now you don't really hear much chatter about them.

    Noel (06:36)

    No, so they also, as well as releasing that reasoning model, they also beat quite a lot of the mainstream models in most of the benchmarks. So it was actually a lot smarter, which I think is probably what caused a bit of worry with the big companies. But actually DeepSeek did release a new version towards the end of 2025. I think that was probably early December, late November. They snuck it out. No one really talked about it, but they did release it and it's kind of up there again in the benchmarks against OpenAI, Anthropic and Gemini. So they're still around. It's crazy that that didn't exist.

    Katie (07:17)

    Yeah. The next thing that I would really love to talk about, because for me these particular things really stand out from my line of work with marketing and community management and strategy, is AI images and AI videos. I feel like they are unrecognisable from 12 months ago.

    Noel (07:50)

    Yes. So we entered the year with OpenAI still having DALL-E 3, which was absolutely awful. It was terrible. You could never get it to create anything that was half decent, whereas other open source or independent companies were coming in and their models were a lot better. So we had things like Flux from Black Forest Labs. That was huge last year because they were really getting into the photorealistic stuff, so you could create those images that look real and everyone had the right number of hands, feet and fingers.

    Katie (08:47)

    Yeah. Fingers, toes, eyes.

    Noel (09:02)

    So it was starting to become a bit more real. And then obviously OpenAI released their images within ChatGPT and the internet just went crazy. Trends left, right and centre. I'd never really heard of Studio Ghibli until that trend came out. And then we also had all those model toys in boxes, didn't we?

    Katie (09:38)

    God, it's something I don't really want to remember, if I'm honest. Like I said, the first few were really entertaining. But what I do really love about it was how much it was actually getting people involved with AI. Maybe people who would have never used AI before were like, "Well, I want to create my own, so I'm going to try it." I feel like that's what I have particularly loved about these big viral trends. It's got people who wouldn't necessarily be thinking of using AI to start using it and playing around with it.

    Noel (10:07)

    Yeah, they just made it so easy by just asking ChatGPT to create an image and it goes, "Yeah, sure. There you go."

    Katie (10:30)

    Yeah, and it blew people's minds and obviously they felt they needed to share it with the internet.

    Noel (10:37)

    Everyone did. Although I do get a bit worried about the copyright with some of the stuff that comes out of there. Because I remember I created some of those toys and stuff and I said, "Could you put it in packaging that's a bit more on brand with Apple, that sort of sleek packaging?" And it went, "Yeah, sure." And it was like, "Here's the Apple logo." And I was like, "No, I'm not sure we can use the Apple logo."

    Katie (11:07)

We don't want Apple lawyers on our case. Thank you very much. That's literally copyright infringement.

    Noel (11:21)

    Exactly. Absolutely bonkers. But they've also improved it. I mean, that was only version one, but we're currently on 1.5. That is actually a lot better. I find it's still not the best. I think Google kind of stole that with Imagen.

    Katie (11:45)

    If anyone ever asks me what I can use to create AI images, I always recommend Imagen. And I've noticed as well, particularly on TikTok, so many people using Sora. I've seen some really realistic looking videos from Sora and I've had to kind of go, "It's AI." It's so good.

    Noel (12:19)

    Yes, or Veo 3 from Google. I mean, they released Sora and then Sora 2 last year. The first Sora was pretty rubbish. They kind of cherry picked the good results that they got out of it and put that in their marketing. But I remember trying it and I was so disappointed. I was trying to use VPNs because it was only available in the US. They were still like, "No, you got to be on a wait list." Then I finally got it and I was like, "Oh, really?" I would always ask it to create a car driving on an Alpine road, like a twisty road, dynamic shot. And the car would be driving backwards.

    Katie (13:36)

    I think people might know that's AI.

    Noel (13:40)

    Exactly. But that's always been my test with it. Every new video model that comes out, it gets that same prompt and I always see if it can do it. But these days it's a lot better. Sora 2 is better. Obviously you can get slightly longer lengths as well now because they were capped at like four or five seconds initially. And Veo 3 from Google is really good as well. They seem to be pushing each other along in those advancements. You do have to check yourself sometimes when you think, "Is that real? No, it's not real."

    Katie (14:14)

    Yeah, so I would say images and videos have just come a long way. Actually, before we move on to the next thing, something I have noticed as well is AI images have become so good that stock image websites are now putting AI images on as stock images. They've become that good. If you think this time last year, there's no way stock image websites would do that. And now it's really hard to know, is this AI, is this real?

    Noel (15:35)

    No, because I remember Adobe brought out a thing where you could submit stock images, but you could tag them as AI generated. So you could sell your AI generated images on their platform, which is crazy.

    Katie (15:46)

    Yeah. And I'd imagine most stock image websites and memberships have an AI image section, probably soon AI video section as well.

    Noel (16:05)

    Yeah, absolutely. It's come really mainstream, hasn't it? It's bonkers how far it's come.

    Katie (16:17)

    Okay. So Noel, I feel like you're quite excited about the next thing we're going to be talking about, and that is agents.

    Noel (16:26)

    Yes! Who knew? We didn't really know about agents at the start of last year. I just can't get over that.

    Katie (16:35)

    Yeah, that is mind-blowing almost, isn't it?

    Noel (16:39)

    It is. Because I remember it was late 2024 and a lot of the AI and automation YouTubers got into n8n because they had this magic agent module. And then that kind of exploded.

    Katie (16:43)

    It was all buzz around the no-code AI agent builder, wasn't it? In the AI world, those buzzwords were just flying around left, right and centre. It was everything, and if you weren't talking about it, you were almost like, "What are you even doing?"

    Noel (17:18)

Yeah, exactly. It's crazy really. But I remember trying it out with n8n at the very start and it wasn't very good. It kind of did a job, but you can give it tools and access to like your Google Calendar or your inbox. And then half the time the agent would forget to use those tools or just make up the answer. So there were a few times I was chatting with a client and she took out her phone and said, "Could you create a meeting for me and Noel next week at this time? This is his email." And she sent the voice note and it came back saying, "Yeah, it's all done." We just sat there looking at our calendars going, "No, no it's not." It's saying, "I've sent it to Noel. He's got an email. I've sent the invite." We're like, no. But thankfully it's come on a long way. n8n have made a lot of improvements recently. It's far more reliable. But agents went absolutely berserk last spring. Everything seemed to need an agent, although you often didn't need one.

    Katie (18:55)

    Yeah, everyone thought they needed an agent because the buzzword was being thrown around so much. Everyone thought they needed an agent instead of just a basic automation.

    Noel (19:03)

Yeah, exactly. We had lots of examples. I remember when OpenAI first brought their agents out. One big thing was you could get it to go off, do loads of research, and then create a PowerPoint presentation at the end of it. That was one of the real big use cases. And I remember doing it. It must have taken about 25 minutes for it to do the research. I was like, go away, have two cups of tea and then come back. And then when I opened the presentation, I was like, "What on earth is this?" Nothing was formatted. It just looked absolutely awful.

    Katie (19:46)

    It was like a five year old could have actually done something better.

    Noel (20:01)

I should have fired up Microsoft Office 95 and used it with a bit of WordArt.

    Katie (20:11)

    Gosh, I mean that's quite in at the moment to be honest, that whole nostalgic 90s vibe. So you would have been right on trend. You should have gone with it.

    Noel (20:18)

    Exactly. Who knew? But there were all kinds of things. So then we'd also get agents that can control your browser. Now we've got AI browsers from OpenAI, from Perplexity. Claude have got Google Chrome extensions to control your browser. So it can go off and do your shopping for you online so you don't have to. Not sure how much I'd trust it to do that. But I would say it's become a lot more reliable than what it was within the last 12 months.

    So Make.com obviously rushed out their version of AI agents. I went to their conference in November 2024 and they made no mention of agents. And then come March, they'd released an early beta version because of n8n and all of its popularity.

    Katie (21:30)

But that just shows how fast the AI world moves, doesn't it? You think about the Make conference that we went to in November 2025 and how they were talking about all of their upcoming things. I feel like they've almost got the first half of 2026 planned out for their developers to work on behind the scenes. Yeah, it would kind of be like imagine they've planned everything out and then someone just goes, "Hold my beer. We've just released this." And it must happen to so many of the different software platforms. They're just like, "We had just planned these massive updates and now we've got to try and create something else." It must be quite exhausting actually to keep up with all the new updates. And that's what we try and do on this podcast as well. We try and keep people up with updates that actually matter and are going to have some impact on you and your business, rather than just the micro updates that don't really have that bigger impact.

    Noel (23:05)

    Yeah, exactly. I released my iOS app at the end of last year. It controls AI agents on your phone from Make and n8n. I posted it on LinkedIn and then the head of AI at Make.com liked that post and I was like, "Whoa, don't you go stealing my idea!"

    But yeah, they've got to move so quick. With Make, they definitely do move really quick. They release things, put it out for testing, and I usually get closed beta access to some of their new tools over the last year or so. So I can say, "I've spotted this bug," and they're like, "Brilliant. Thanks so much. Our team is already working to fix it." Or I'll say, "I think we should add this feature," and they're like, "Brilliant. That's kind of what we were thinking, but we haven't added it in yet. Good to know. We'll add it in." They're really receptive, which is awesome.

    But I guess also with agents, one thing that didn't exist this time last year was MCP, which is the Model Context Protocol. It's a bit techy, I know, but when you connect tools to an AI agent, you would normally connect it to an API. So every single thing—you want to create a line in a spreadsheet, that would be one API call. To delete a line in a spreadsheet would be another one.

    But with MCP, you could give access to the entire range of APIs for a platform and then go, "You know what AI agent, you figure it out yourself." And it looks at it and goes, "Right. So Noel's asked me to delete a row in this spreadsheet. What is that in the MCP? Use this. This is the information I need." And it goes, "Fine. There you go. Do it." And it does it. It's just incredible how that works. Normally we'd have to connect tens of different tools to these agents, but now you just connect one MCP and then grant access to all the different tools you want to give it. It really speeds things up.
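To make that a bit more concrete: under the hood, MCP messages are JSON-RPC 2.0, and an agent first discovers what a server offers via `tools/list` before invoking a tool via `tools/call`. Here is a minimal sketch of what those two requests look like (the `delete_spreadsheet_row` tool name and its arguments are hypothetical, just to mirror the spreadsheet example above, not from any real server):

```python
import json

def mcp_request(method: str, params: dict, req_id: int) -> dict:
    """Build a JSON-RPC 2.0 request of the kind MCP clients send."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# 1. Discover every tool the MCP server exposes -- this replaces wiring up
#    each API endpoint by hand, one module per call.
list_tools = mcp_request("tools/list", {}, 1)

# 2. The agent picks a tool from that list and calls it with arguments
#    it has worked out from the user's request.
call_tool = mcp_request(
    "tools/call",
    {"name": "delete_spreadsheet_row",  # hypothetical tool name
     "arguments": {"sheet": "Leads", "row": 42}},
    2,
)

print(json.dumps(call_tool, indent=2))
```

So instead of connecting ten separate API modules, the client sends one `tools/list` and the model decides which `tools/call` to make, which is exactly the "you figure it out yourself" step described above.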

    Katie (25:42)

    Yeah. I can tell you're really excited about that one, Noel. I feel like you've geeked out on that.

    Noel (26:04)

    Yeah, at the time I didn't really understand what it did and then when I really got into building agents I was like, "Yeah, this is pretty cool." It does use up a lot of tokens on your API, but it's a bit of a trade-off. I also find, as a top tip, OpenAI models aren't very good at using MCP. Anthropic, they get it right every time in my testing.

    Katie (26:38)

    Okay, good tip. Excellent. So then moving on to the automation platforms.

    Noel (26:43)

    Yes, lots of big updates, especially with Make.com of course. They released the Grid in the summer last year. That allows you to visually see your entire account on a real top level view so you can see what's going on. They've added things where you can see which scenarios in your account are using up all of your data or all of your operations. Or even highlight errors where there's things that have failed and it flags it up. You can zoom in and click on it and then open it up. It's a really nice feature.

    Katie (27:27)

    But when you think of it, you think that is actually quite a basic thing to have, but it just didn't exist. And then you're like, "Well, yeah, of course I need that."

    Noel (27:43)

    You didn't know you needed it until they showed it to you. There were lots of little things on there and I was like, "It'd be good if it showed data use." And I fed that back in the beta and they're like, "Yeah, definitely." And then it's like, there it is, there's the toggle switch for that. That's handy, thank you very much. Because that's kind of hidden on the Make platform. You can see your overall usage but finding where that usage has gone can be quite tricky. So I'm glad they've updated that.

    And I think for next year, maybe the year after, they did mention about being able to build automations within the Grid from a top level. So I'd be interested to see what that looks like. That was kind of very quickly mentioned and then, "Let's move on to something else."

    Katie (28:47)

    Yeah. Moving on. Probably because they just have it as, "This is what we're going to do, but we don't know how we're going to do it, what it's going to look like, when it's going to come out. No questions please. But we will mention it."

    Noel (29:04)

    Yeah, exactly. Hopefully no one would notice we mentioned it.

    They also moved to hosting their own AI models as well. And they changed the complete billing system from operations to credits, which really confused me at the time. I was like, "I don't really get it." And then I had a chat with the head of AI and I was like, "I get it now." Because they're bringing out their own AI models—well, they're not bringing out their own, they're using OpenAI models currently. And then your credit usage goes towards the usage of the AI instead of you having an OpenAI account and connecting that in.

    Katie (30:00)

    Yeah, like you're almost using OpenAI on Make as a third party.

    Noel (30:07)

    Yeah, definitely. Which is really cool. And they've released lots of different mini modules. They do categorisation tasks. So that's something you would normally put in an AI model in the middle and then you'd have to do all the prompt engineering and figure all that out. Whereas now all of that's done. You just put the information in and tell it what you want to come out the other end and it's all there and good to go.

    They've also got their data extractor as well, which is a pretty cool module. I did feed back that it's a bit pricey on the credit front, to which they agreed with me and said, "Well, we're kind of losing a bit of money every time someone uses it." I was like, "Fair enough." Because I think it was 20 credits per page at the time. And then within a week they said, "Thanks for your feedback. It's now down to 10."

    Katie (31:12)

    50% off! Any other automation platform updates you want to highlight from 2025?

    Noel (31:30)

    So I guess the big one, it's not really a prediction so I won't leave it for that bit, but obviously Maia is something huge that's coming out this year for Make. So Maia is going to be a way that you can have a conversation with an AI agent and it will go off and build the automation for you. So you don't really need to know that much about the platform in order to use it. From a beginner standpoint it's great, but I think we're going to get to a point where you've got those that have the skills to do it normally and then those that don't and rely heavily on Maia to do it for them. And I guess that kind of comes up with vibe coding and things like that. I do a lot of vibe coding, but I'm not a software engineer. So yeah, I play at it.

    That's a big update that's coming. And also they've released a new version of their AI agents this year as well. I do have access to that currently, but I'm not allowed to talk about my thoughts and opinions on it. So we'll swiftly move on.

    Katie (33:01)

    Well, we'll save that for when that is allowed. I feel like we should move on swiftly because we don't want to get into trouble for saying anything more.

    Noel (33:07)

    Yeah, I could say you can build it visually because they've shown it at Make Waves last year. It's a cool builder. But we'll move on.

    I guess the last big update really with n8n is they released a lot of incremental updates. It seemed like every time I opened n8n, there were five or six updates that needed adding. They're adding lots of different features. Towards the end, I think it was December, they completely changed the UI. It kind of still looks the same, but it looks a bit more modern. But one of the big things they brought out was Chat Hub. That basically looks like ChatGPT, but you can chat with your agents using that interface and you can chop and change between them within the conversation, which I thought was really cool. That kind of spurred on my app that I built at the end of the year. I thought that's actually a really cool feature. I'll add that into my app.

    Really awesome to see that. I don't know if Make would bring something like that in. But who knows?

    Katie (34:39)

    I feel like if one brings it out then the other one does, just with a slightly different spin on it. So I would imagine so.

    Noel (34:45)

    Yeah, exactly. But I guess, well, a shameless plug, but where my app differs is I can connect to Make and n8n. I don't care who uses what. Whereas n8n are never going to let you connect Make agents, and Make will never let you connect n8n if they built a similar platform.

    Katie (35:04)

    Okay. Well, we will put the link to your app in the show notes. Go and check it out.

    Okay, so I feel like we need a little bit of a drumroll here. 2026 predictions for AI and automations. I feel like even in about six months time, if anyone's listening to this, you might be able to have a good chuckle at what these predictions are. Because as we know, in the AI world, it is a bit like the wild, wild west and anything can happen. Anything goes.

    Noel (35:51)

    Expect the unexpected, as they say. I'll have to quickly check my phone. I've got no updates currently to tell me there's something new come out. So we're good for a minute.

    Katie (35:53)

    Quick, before something else gets an update! Okay, go for it, Noel.

    Noel (36:11)

    I think for me, one big prediction is by the end of 2026, Google will be the leader in AI. So I think they've got the edge. For the past few years, I always felt that Gemini models have been playing catch up with OpenAI. And I've been using Gemini 3 a lot recently. I use Claude Code a hell of a lot—Claude Code does all my iOS development. But I've been building things in Google AI Studio, which came out last year as well. You can build apps for free on there. So I've been doing lots of different things with Gemini models.

    I just think with OpenAI, they've got to build the infrastructure in which to train the models and have people use it. Google have already got the head start on the infrastructure. They've got all of their cloud services, they've got data centres all around the world. They're kind of already there.

    Katie (37:47)

    Yeah, well they're just so much bigger than the others that are doing it.

    Noel (37:55)

    Exactly. Huge. But I just think with that and the money that they bring in from other things, they'll very quickly become the leader. Because with OpenAI, they kind of rely on the revenues coming in from people using their API and ChatGPT subscriptions in order to fund what they do. I mean, they've got $500 billion from Trump last year to help them build these data centres. But I can just see Google taking the reins and kicking on from this point. I think they've caught up now.

    Katie (38:30)

    Yeah, but again, they're just so much more established. They're such a huge company. We've passed Google offices in the most random of locations as well. They are literally in every corner of the globe.

    Noel (38:56)

Yeah, exactly. And most of the things that they release are generally free to start off with. OpenAI aren't going to do that. Obviously they've got free ChatGPT accounts and things like that, but they usually gatekeep the big features for people that pay for it. I just feel it's going to start taking over.

    Katie (39:28)

    I mean, it makes sense right now. But again, who knows what's going to happen next week, next month, next quarter.

    Noel (39:35)

Yeah. Because I think a prediction as well is that OpenAI are going to get to a point where the cost of building the infrastructure is going to have a negative impact on consumers. It already is. It's going to get to a point where consumers are going to start pushing back and be like, "Why are these everyday goods going up in price just to feed your data centres?" So I think we're going to hit a pricing ceiling. They might stall just a little bit. Who knows? And I think Anthropic are kind of in that space as well because they've got to build the infrastructure and rely on someone else to do it for them.

    Katie (40:45)

    I'd just like to say as well, Noel refused to tell me what his 2026 predictions are before recording this podcast. I was like, "Okay, so 2026 predictions, do you want to talk about them?" And he was like, "No, they're a surprise." So as you're listening, I'm actually also finding out what Noel's 2026 AI and automation predictions are. Is this as much a surprise to you as it is me?

    Noel (41:17)

    As it should be. So I guess my next one is more AI physical products for 2026. So I know we kind of just touched on this.

    Katie (41:31)

    Is that because you heard about the leak? Yeah, go on.

    Noel (41:38)

    Yes, I was going to mention that. So obviously last year, OpenAI—I just downplayed them for not having infrastructure and relying on stuff. This could also fund their development as well. But they bought out the company run by Jony Ive. He's the lead designer for all of the iconic Apple products like the iPad.

    Katie (42:07)

    Oh god, I just love those. Some of the really old MacBooks with the clear backs to them with the colours and stuff. I still want one. I saw one really recently and I was like, "Oh my god, I so want it."

    Noel (42:07)

    Yeah, so he's designed all of these incredible things. He did the iPod as well. And I think the first MacBooks as well, now I think about it. There's a lot of things he's had his fingers in and they're incredible products.

    Katie (42:26)

    I mean, I'm a big fan of his work then, in that case.

    Noel (42:48)

    Exactly. The leak that I saw the other day—because obviously last year when they bought them out, they said, "We're going to be bringing a product to market. We're not going to tell you what it is."

    Katie (42:58)

    Yeah, they kept it all hush hush, didn't they?

    Noel (43:02)

    Probably mid-2026, early 2026 maybe, they're going to tell us what it is. At some point. So they were like, "We're not telling you now."

    Katie (43:07)

    Okay, I understand you now.

    Noel (43:26)

So we haven't really got that long to wait. But the leak I saw was it's potentially going to be like a normal pen. Which sounds ridiculous, I know. But as you write on—I don't know if you would need some sort of special pad to write on—but as you're writing, it would then populate it into an AI ChatGPT-style interface. So it's now not limited by your typing speed. Your interactions are limited by your writing speed, I guess. But then they also mentioned there could potentially be a microphone at the end of the pen. So you could be writing and then just click a button and start talking and then have a two-way conversation with AI at that point. Which I thought was quite clever.

    Katie (44:22)

    Yeah. And you think other products will come into play as well this year?

    Noel (44:31)

    Yeah, I mean we've seen the AI glasses by Meta and Ray-Ban and I think that they're going to come on leaps and bounds in the next year.

    Katie (44:45)

    I've seen as well, obviously when we were at Make Waves, there was a talk from someone from Meta and they were talking about the new Meta glasses update which looks awesome. But I've also seen a lot of content creators wearing the Meta glasses and actually going into places and being told to remove them. But what scares me a little bit is that if they create something else—maybe it isn't something wearable like glasses, it's something else—it's kind of a bit like, well, then you don't actually know when you're being filmed. That scares me a little bit if they go in that direction.

    Noel (45:34)

    Yeah, that's true. I get the privacy concern, because the cameras on the Ray-Bans are so obvious when you spot them. And I get why they need it, but it's a bit odd, isn't it? But back in the day, Star Trek always used to come up with these ideas that seemed completely bonkers. They had the first flip phone, and then in the late nineties flip phones actually appeared. I was like, "I can't believe that's what Star Trek used to do."

    So I'm thinking there'd also be products where you can communicate with AI just by talking to it normally. I mean, you've obviously got Alexa and things like that, but something that's wearable that you could just take with you. Could be AirPods so you can chat to it and it would just talk back to you for that AI function.

    Katie (46:52)

    Yeah. I'm slightly excited, but also a little bit terrified at the same time.

    Noel (47:05)

    Yeah. We'll have to see what crazy stuff they come up with.

    Katie (47:13)

    Any other 2026 predictions that you want to share with us?

    Noel (47:18)

    So a big one would be AGI—artificial general intelligence. Everyone's kind of pushing and saying, well, they've always said that in 2026 they reckon it would happen. But I don't think it will. I'm going to say it's not going to happen at all in 2026. I don't think we're quite ready yet.

    Katie (47:44)

    What do you think? Is it going to be delayed to like 2027 onwards? Could do. It's possible. Okay.

    Noel (47:48)

    Yeah, unless Google play an absolute blinder with the infrastructure and stuff. But I just don't think we're quite there yet. Especially on the mass market level. A lot of people are still just getting into ChatGPT and stuff.

    Katie (48:11)

    Yeah. I mean, some of my friends were using ChatGPT for the very first time in the summer of 2025. So yeah, that was interesting, because I was just like, "How are you not using ChatGPT?" It's so woven into my life. I always use the analogy of the internet: it's kind of like, how was someone not using the internet almost straight away? Really interesting. So I feel like definitely more people are going to be using LLMs. That's going to be really interesting.

    Noel (49:04)

    Yeah, definitely. I think my other prediction is that we're not seeing AI's impact on the job market just yet. Nothing noticeable, anyway. But I think this year we might start to see things creeping in where employment might be harder to get. Because startup companies will be like, "I could hire somebody to do it, or I could just train an AI agent to do that function for me." So yeah, I think 2026 could be difficult in some areas.

    Katie (49:45)

    And do you think that's like the low-skilled jobs?

    Noel (50:00)

    I guess so, yes. Or jobs that involve manual data entry, or those sorts of things that you could easily now automate or look up with AI. So I think a lot of jobs are fine, but I think we'll start to hear and see it a bit more this year, unfortunately.

    Katie (50:26)

    Yeah. And we have actually got a really interesting episode coming out next week with Ashley Kay, where we talk all about using AI within work versus not using it. And I'm really excited for that episode to come out. So I feel like we'll talk more about it next week with Ashley.

    Noel (50:31)

    Definitely, yeah. It's a good episode, that one.

    Katie (50:54)

    Yeah. In the wild, wild west.

    Noel (50:58)

    I think those are probably the biggest ones that we can predict at the moment. So yeah. Who knows what it's actually going to be. But I'm excited to see what happens, to be honest with you.

    Katie (51:09)

    Yeah. I'm really excited to hopefully be sat here this time next year doing a similar episode going, "My goodness, this is all the amazing things that happened and no one saw any of it coming."

    Noel (51:28)

    Yeah, exactly. There's going to be some bonkers stuff. There'll be more crazy trends. Definitely.

    Katie (51:35)

    Well, thank you so much for listening to this week's podcast episode. We really hope you enjoyed it. If you did, please leave us a review. It really helps us with our podcast and to be seen by more and more people. If you've got a friend that you think would enjoy it, please share it with them. It would mean the world to us and we will catch you next time for another episode very soon.
