Ep. 95

How Interact AI is Getting Built with Team Interact

This episode features Interact Software Engineer Graham; Digital Marketing Manager and host Jessmyn Solana; Customer Success Manager Damaris Pacheco; and Social Content Manager Jesy Nelson.

In this episode we will cover:

  • How do you keep up with how AI is evolving?
  • How do you build a system that keeps updating and evolving without trying to do everything that people ask?
  • How do you figure out what customers actually need and rank the difficulty of building those things with engineering?
  • How do you prioritize feature requests in a fast-paced software company?

Ready to put our AI-powered quiz maker to the test? Get started here!

Hi guys, and welcome back to Interact’s Grow podcast. So great to be with you all, as always. I’m your host, Jessmyn Solana. We’re missing a couple people today, but we do have an add-on. So with us we have Damaris as usual, Jesy as usual, but we also have the Great Graham. I didn’t mean to do an alliteration. But Graham is from our engineering team.

He’s been with us how long now? I’ve been here officially since last October. But I started contracting about a year ago for you guys, and then joined full time in October. I was gonna say, I was like, wait, you’ve definitely been here longer than a year. But we are doing an episode this week where we’re gonna ask Graham some questions about the product, more specifically how the Interact AI tool is being built.

But before we get into all of that, Graham, can you tell us a little bit more about you and your role in engineering at Interact? Yeah. So like I said, I’ve been here for about a year, full-time since October. But yeah, I’m a software engineer here, and I work with two other amazing engineers, and we just crush stuff out.

We are constantly coding and building out new features and listening to, you know, you guys and the customer and just trying to build the best product that we can. So for me, I’m just every day planning stuff out, features, and then writing the code for it and deploying it and monitoring it. I love that.

So you’re deep in sort of what the product looks like behind the scenes, and right now we’re building the Interact AI tool. Can you give us, I mean our listeners have heard everything and anything about what it does, why we’re doing it, but how is it being built? So do we want to get into like the technical stuff or just like high level?

Let’s start with high level. Yeah, so you know, this idea, I don’t know when we sort of came up with it as a team, but I know that Josh came up to me and was talking to me about it several months ago: Hey, there’s this great ChatGPT tool, we can use AI to do all of these different things.

And we started discussing it, and I’m sure he was discussing it with you guys and other members of the team as well. But we know that one of the issues that people have when they first sign up with Interact is just that initial process of getting your quiz together, trying to get onboarded, and publishing your first quiz.

I know for me personally, when I first joined the team and I tried to create a quiz, I just tried to do a very basic quiz about engineering, right. Just to see what the product was like, and I was getting stuck doing it. I was so confused as to whether or not I was asking the right questions or wording it correctly or whatever.

And so, Josh came to us with this idea that we can use AI to help onboard people. We can ask them questions so that we can create a really robust quiz for them right off the bat and get them to publish it way quicker than when they go through a manual process of having to do it themselves, perhaps reaching out to customer service, whatever it may be.

So it might be two, three weeks before they actually get their quiz out, whatever it may be. Whereas now, you know, ideally we’ll be able to get people a quiz and have it published within minutes. And so removing that barrier to entry was a big motivation for us. Yeah. I love that. Jesy, did you have anything? No, I just think it’s crazy that you guys just discussed and talked like, oh, this would be really cool to do, and really this is such a game changer for our whole business as a whole, and for the whole quiz industry, I feel like. And I think it’s just cool that one little conversation turned into we’re doing this, like you guys hopped right onto it.

I feel like, so it might seem like we did. I think there was a lot of behind-the-scenes discussion before we actually started sitting down and formally planning out what the MVP looks like for this product. We had a lot of conversations about the customer-facing features and what it can do for the customer, but the engineer in me was also really excited to start working with ChatGPT, just because everybody is talking about it nowadays and it’s the latest, greatest thing.

With that said, we do have concerns around it, just as a team, and around AI in general, and I think that’s something that our customers should know: we’re not offloading the responsibility of helping our customers to some sort of computer algorithm that’s doing stuff behind the scenes. We wanted to create an experience for the customer where they have input, and then we’re also reviewing those quizzes, and we’re always available to help them tailor and tweak those finer details of what the AI machine spit out.

And so it’s not just an AI machine that’s building this for them. It truly is a tool rather than the whole thing, I guess. And that’s something that we had discussed before we even started building this. Okay. Right. Like, how do you keep, you know, that human element when you’re basically putting a robot at the forefront to help people create a quiz? Yeah.

That, that’s the challenge. Yeah. That’s definitely something that we will continue to iterate on as we release this out to the public and people are seeing it and giving us feedback. But I think, you know, some of the discussions that we’ve had around making it more human, I guess, are about ensuring, a, that we’re always available, and that they know they can discuss their quiz with someone in person.

I think that’s one of the biggest differentiators between our company and maybe some of the other competitors out there: our level of customer support, right? We have the ability, the way that we’ve built the system, to constantly change the questions based off of the feedback that we’re receiving from customers.

So if we’re not asking the right type of question, or it’s not getting their voice into the way that a question’s being asked on a quiz, right? Like we’re constantly able to take their feedback and tailor the questions and the prompts that we feed to the AI machine so that we can get better results.

And then the other thing too is just the way that we explain how this works to the customer. That might be a separate landing page for how the AI works, blog posts, things like this, right? Making sure that we have the right copy down on the page so it doesn’t feel so robotic and static when a customer’s going through that process. And, again, we’re still working on the MVP, and we hope to have that out sooner rather than later, and we’ll continue to iterate on it and make it feel more like they’re actually working with someone and not a machine.

So since you have inside info on how it’s actually being built, what that process looks like, let’s say, right? You know, there’s all these other companies coming out with their own AI. Like Notion has an AI. Help me out, guys, who else? Gmail just released an AI. Google. Yeah. Google. Yeah. God, it was at the tip of my tongue.

All these companies, right? So, in terms of how long it takes, how are companies like that able to pump out an AI so quickly while we’re still in assisted beta? So, I mean, some of those companies, like Google, right, they actually are creating the AI engines and the algorithms and whatnot. We’re not doing that.

You also have to remember that these guys have tons of money and huge teams to iterate and rapidly develop things, and they’re able to dogfood it to, you know, their own team, which could be thousands and thousands of teammates testing this out. Whereas right now for Interact, it’s just three engineers building this, as well as the rest of the team having input on it.

And you know, as a bootstrapped company too, we need to be very thoughtful about how we approach things. As we were going through the product planning and thinking about all the features and products that we wanna build, we looked at the AI stuff and we said, how important is it to get this out as fast as we possibly can?

And we all agreed that it’s important to get out there, and that’s something that we wanted to do this year. That was one of our goals. But we also felt like it was important enough that we didn’t wanna rush it; but we don’t need to get it perfect the first time either. Like, we understand that we need to iterate and continue to create new versions of it, and so we’re sort of taking a faster pace, but not super, super fast.

And we’re okay with that. Right. That’s a conscious decision that we made as a team by looking at a bunch of different factors. Yeah. I actually have a question. So you’ve seen, like in Notion, for example, how it can do the AI, like, you know, you start typing something and the AI will fill it out for you.

Are these companies that have these features, they’re not building their own AI, right? They’re somehow connecting back to ChatGPT or the OpenAI Playground and using that, I’m assuming, right? Yeah, I mean, I obviously don’t have the insights on what goes on behind the scenes with those companies, but I’m assuming that they’re hooking into an AI platform.

They’re not necessarily creating their own AI engines themselves. They might be feeding their data model with certain things, which we don’t do. At least right now, and I don’t think we’re going to. I think we want to be very cautious about how we handle data and people’s privacy, which I know is something that we had talked about too during the pre-development phase: making sure that we’re really safe with customer data and our own data as well.

So, but yeah, I don’t think that every company out there is creating their own AI engine. I mean, that’s a really big task. Yeah. Is there a benefit to everybody tapping into an open AI? Or, I’m assuming Google’s is totally separate. Bard. Yeah, Bard. Yeah. I mean, so I’m not an AI expert by any means in terms of how the algorithms are built and how the machines are built.

My understanding, though, is that the more information that you can feed these models, right, collectively, whether it’s us putting data in for quizzes or it’s the worldwide web, whatever it is, the more accurate it can become, right? Because at the end of the day, we are training these models.

It’s not like the models are just thinking for themselves, although I know that’s the concern down the road. But, you know, they’re using probability and statistics to be able to scrape the web and the data they’ve been fed to come up with finishing a sentence, for instance, or, for us, based off of the prompts that we give them, what a quiz should look like for this customer based off of how they answered.

So collectively there is a benefit, I would say, to more data being pushed in from everywhere and anywhere. I love that you mentioned the prompts, because I know we’ve talked about prompts before, but I don’t know, Jesy and Damaris, did we ever say exactly what it looks like? Because when you write a prompt, you do say, the quiz should look like this, or the outcomes should look like this.

Now write outcomes that, you know, have something to do with this topic and whatnot. It’s not like it just already knows; you really have to be really specific about what you want it to look like. So, kind of alluding to how you’re saying you have to feed it, it won’t just come up with that on its own.

Yeah, yeah. There are other developments being made around that, but for us, yeah, the prompts are super important. So, mm-hmm. I have a question for you guys. When we first started doing this very manually, obviously, how much have the prompts changed since we first started?

That’s a great question. I was actually just thinking that was one of my questions for you, Graham, because it’s been a lot of trial and error, I think, for us as we’ve been doing these quizzes for the last six months. I think for me it’s changed probably two to four times, the way that I write in certain things.

Like, in the document that we have currently, I think I don’t use the first two prompts there anymore. I started using Jesy’s phenomenal prompt that she introduced us all to, and then that’s sort of what I base mine off now. Or if I feed it one of the prompts and I don’t like the answers, I’ll just say, Hey, rewrite this, but make it more human. I don’t know, it just changes so much.

And so I find it interesting that you asked us that, because that was actually one of my concerns as you guys are building out the AI: how would we communicate? Because we’re doing so much of the manual work right now, where if I don’t like the way something gets spit out, I can just go in and rewrite it, or rerun the prompt, delete it, and redo it again. Right. And so that was sort of interesting that you brought that up. But I think it’s changed, to answer your question, Graham, probably two to four times for me. And I think it’s changed because we’ve gotten more comfortable with using it.

We’ve, mm-hmm, like, the more we’ve all used it, I think the more we know, like, oh, we can do more than just enter this prompt. We can kind of customize it, you know; it can do so much more. We didn’t realize that if we dropped in someone’s website link, it could read their website and their about page.

We didn’t know that it can see all their products and services and offerings. We used to copy and paste that in, you know, to tell it. But instead we’re like, this is my website, here’s the link, what is the primary concept of what I do? And it goes through their website and gives us a great idea of what they do.

Mm-hmm. So I think they’ve changed, or our prompts have definitely changed, based on the more we’ve been doing it and how much more we’ve learned about it. It’s somehow more simple, but we can put more information in there. Yeah. We made it more simple, but more accurate, I feel like, the way that we’re doing it now.

Yeah. So I think you guys are bringing up a really interesting point, which is that the engineers are coding the process that you guys are currently going through. Mm-hmm. So at the end of the day, this is truly a team effort, because you guys are on the front lines, working with customers, manually entering stuff into the AI playground, figuring out what works and what doesn’t work.

And then, based off of your guys’ experience, Josh was getting back to us and saying, this is what the rest of the team is seeing as they manually build out these quizzes, and so we need to account for that in the code that we write as engineers. Mm-hmm. So, Damaris, you had previously asked how we are going to bridge that gap between the knowledge that you guys have and what we’re doing as engineers.

And it’s really funny that you brought that up, because during our product meeting last Thursday we just had a discussion about taking some time, being hands off from the keyboard, and just asking you guys questions and observing what you guys are doing, to ensure that we are writing the right code based off of what you guys see works best.

Mm-hmm. Yeah. That’s, I mean, that’s great. And I think it’s just because, especially with AI, I feel like it’s ever-changing, right? It’s one of those tools that just constantly changes, constantly upgrades every week; there’s different things that we can feed it, you know? And I don’t know anything about the engineering world or coding or what that entails or how it works.

So, us being sort of customer-facing, it’s nice to have these conversations about how we translate what we see versus the backend system, how you guys are building it. It’s very interesting. Well, the website thing that Jesy mentioned was interesting, because Josh had brought that up last Thursday during the product meeting, and it was something that at least I hadn’t really heard too much about, and certainly didn’t know that you guys were using that feature.

And so that is one of the things about being an engineer. I think, just naturally, the three of us, and most engineers that I know, are curious about the technologies that they’re using and the rapid advancements. Like you were saying, Damaris, it’s constant; the landscape is constantly changing. But it’s not an onerous or arduous task to keep up with, because we’re just naturally interested in it. And I see, just from what Josh was telling us last Thursday, and the website thing, that you guys are just really curious as well, to be able to exploit all of the potential that the AI system has to offer us.

And so, you know, if you guys tell us about something, we then have the ability to go back and look at documentation, read some of the more technical details, and validate whether or not it’s something that we can introduce into our product, or whether it’s a little too bleeding edge and maybe we should hold off, or it’s not quite doing what we would like it to do but it’ll eventually get there.

So again, I think, you know, for the people that are listening, you have to understand that yes, we’re coding it as engineers, but it really is a team effort, and you have to have that constant communication between different departments and different teams within a company. Yeah, that’s super important, and something we talk about all the time on the podcast, but it’s great that you’re coming on here saying it.

Cause we get so many feature requests all the time, and I know it’s super frustrating for customers to maybe not get the feature that they wanted, or that, you know, maybe we started one year but realized we couldn’t make it, or it just wasn’t gonna make it into the roadmap this year for whatever reason.

And I think with the AI tool, knowing how it will be used by customers first, by people you have direct contact with, which is us, is super important. Because, you know, they’re gonna run into the same points where they get stuck, they’re gonna run into the same questions, they’re gonna run into the same, Hmm, that didn’t quite look the way I thought it was going to.

But if we could get it to a point where we could answer most of those questions and build a tool that they could actually use, then it’s kinda like with our quizzes from there: we’re just going to tweak and improve it over time based off of, you know, what does OpenAI look like?

Like, how is it being used? What kind of information is it spitting out now? What kind of customers are interested in using AI? There’s a ton of people who maybe are scared of technology, but how do you make it user-friendly to the point where they’re like, oh, I just click a button and it goes.

Mm-hmm. Yeah. From the product side of things, it’s always very difficult to balance all of the different feature requests, both internally from you guys and externally. And I think the way that we try to handle that is we look at a number of different factors. One, of course, is what is the level of effort it takes to implement something?

Are there other things that can give us, you know, the same bang for our buck, if we introduce another feature that keeps customers just as happy, or something that they really want? But, at least the way that I look at it, and I think the team does too, but I don’t wanna put words in their mouths, is I always look at what Interact’s core competencies are.

What are we enabling customers to do? They need to be able to create a quiz; that means the questions, the colors, the images, all of that stuff. They need to be able to publish it on different mediums so that their customer base can take those quizzes, and they need to be able to see the results.

And so for us, the AI specifically hit one of those core competencies, and that’s the ability to get your quiz out to customers. Because, let’s just say you signed up for Interact and spent two months trying to build a quiz, but you never actually published it. Well, that’s a huge waste of time for the customer, right?

For the person that’s creating the quiz, they’re not getting anything out of it. They’re not fulfilling that core competency that we say we offer. And so the AI does that. It allows people to get a customized quiz based off of their answers to the questions, and gets it out there to their customers as quickly as possible.

So that was always very, very high up on our list when we went through the yearly product plan and what we wanted to accomplish. Yep. That’s interesting. I know, it’s still crazy to me how we’re doing this. Anytime I think back, I’m like, I never would’ve thought that this would be happening, and we’d be doing this with AI, after everything.

Yeah. Sorry, go ahead. No, I was gonna say, it brings up an interesting point for the listeners out there, especially the ones that have interacted with me through chat or through email, asking for specific features, or questions about, you know, why we do this versus something else.

You know, Graham brings up a good point that sometimes we have to give precedence to certain features over other ones, just because we have to look at the major concern or the top priority of the company and for our customers. And for all of you guys listening, we understand a lot of people want ranking and categories; that’s one of our top requested features of all time, I think.

But sometimes, I believe, and I don’t know, maybe you can talk about this a little bit, Graham, I think it’s more complicated than it sounds, and so sometimes certain projects just take a little bit longer, to sort of understand the behind-the-scenes of what they actually entail. And for the business of quizzes, I think AI took the top priority this year when we introduced it early this year.

So I don’t know if you have a little bit more insight, Graham. Before you answer that, I wanna add to that question, slash kind of clarify it a little bit. When you’re looking at what projects engineering decides to do for the year, why would something like Interact AI take precedence over features that have been requested for years and years and continue to be requested?

Yeah, I wish Josh were here to answer that, cause he would probably answer the business part. Yeah, he could definitely answer a little bit more about the historical requests that have come through. But from what I’ve gathered in our conversations in the year that I’ve been here, one of the biggest blockers, again, is people getting their first quiz created and pushed out to their customer base.

And so for us, we’ve just felt like that was the biggest concern, in terms of getting new people onboarded, and existing customers that may want to create a new quiz, right? Limiting the amount of questions that they’re going through when they’re manually creating something. We just felt like that was the most important thing going forward.

And again, that’s not to say that these other requests aren’t important, but we felt like this would be able to really help a large portion of people that are coming to Interact. And Damaris, what was your other question? You had a question as well, I’m sure. I feel like you were going toward asking him about how people have feature requests, and it seems like a simple kind of thing.

Yeah, yeah. People are kinda like, why wouldn’t you guys have this, or do it right away? So, tell us a little about, when we say, can we do this, what goes off in your head? You’re like, oh, there’s 10,000 other things below that thing to accomplish that other thing. So I wanna preface my answer by saying that we take all requests seriously, whether they come directly from you guys or directly from the customer.

We do look at and vet everything. We have a giant list, or backlog, of different things that we want to do and things that have been requested. So they don’t go unnoticed; we do look at everything. As for the part about, why isn’t this feature just there: I have a unique perspective on this, because I was in customer support before I became a software engineer, and I had that same thought.

I was always like, Hey, why don’t they just put this button over here? Why don’t they just do this or that? And then when I got into the engineering world and I started actually writing code and sitting in on plannings and meetings, it became very obvious that the bulk of work that goes into creating a feature has to deal with not just the feature itself, but all the side effects that could occur from putting in a feature, all the testing that needs to go into place, all of the planning and the thought about these different contingencies. Well, what if this happens, or that happens? Can we handle the load of the new feature?

It might drive 10x traffic or something crazy like that. I mean, even with the AI stuff right now, we’re trying to figure out what is the best way for us to queue up all of these different requests that come through the website, and then we have to generate quizzes. We wanna make sure our system can handle that.

And even if we’re not building those specific features right now, we still have to plan for those features to be built later on in the future. So again, what seems like a very simple request may actually take a lot of time and planning. And I think Jared on our team, who’s moving over into an engineering role and a product role after being in customer support, he’s seeing that. When I sit down with him and we work through coding exercises and different projects that he’s doing, he always tells me, man, there’s a lot more that goes into this than I ever thought.
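The queuing concern Graham mentions, taking quiz-generation requests off the website and working through them so the signup flow stays responsive, can be illustrated with Python’s standard library. This is a toy, single-worker sketch with made-up names, not Interact’s system; a real deployment would more likely use a hosted job queue, but the shape is the same:

```python
import queue
import threading

def run_quiz_worker(requests: list[dict], generate) -> list[dict]:
    """Drain a queue of quiz-generation requests with one worker thread.

    `generate` stands in for the (slow) AI call; queuing keeps the web
    request fast and lets us cap how many AI calls run at once.
    """
    jobs: queue.Queue = queue.Queue()
    results: list[dict] = []
    for req in requests:
        jobs.put(req)

    def worker() -> None:
        while True:
            try:
                req = jobs.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            results.append({"user": req["user"],
                            "quiz": generate(req["topic"])})
            jobs.task_done()

    t = threading.Thread(target=worker)
    t.start()
    t.join()
    return results
```

Adding more worker threads (or processes) is then the knob for handling a 10x traffic spike without changing the rest of the flow.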

Yeah, it is crazy. Like, in my head I’m like, yeah, just move that button, but then I see the code for that button, or what that button’s supposed to do and where it leads to, and blah, blah, blah. It’s so much more complex. It’s not just like, yeah, it’s fine, we can do it in a heartbeat. Yeah. I think you also pointed out, and this probably mostly comes from Josh because, you know, CEO, but you have to think of it like, how can we help the most people at this time?

And it’s like, you know, if you have a feature request that seems simple but takes a lot of planning, as you just said, that might help a certain percentage. But something like Interact AI took precedence because it was gonna help a bulk of the customers, either ones coming through now, or existing ones that maybe wanna come back but never quite got their quiz right and just, you know, didn’t have the time, or what have you.

But now they can actually get all that stuff done, you know, and we’re gonna help the most people at one time. Yeah, yeah. There is a bit of intuition that’s involved with it. Like, again, we have a huge backlog of requests, both internal and external, and we sort of use our intuition. And, you know, for me specifically, I rely a lot on Josh and Matt, because they’ve been in this business a heck of a lot longer than I have.

There’s also a concept called weighted shortest job first. I’m not gonna get into the crazy details of it, but basically you’re trying to understand what is the cost of delay of putting a feature on the back burner, right? So you’re not necessarily looking at what we are going to get out of doing it right now, but rather the inverse of that: you know, how much is it gonna cost the business if we delay this?

When I say cost, that’s not just in terms of potential revenue that we’ve been missing out on; more importantly, we also look at things like how it’s going to affect the customer, and whether it’s going to enable new business opportunities for us. So right now we’re redesigning the UI editor too; that’s the other major project. That’s something, you know, I’m guessing our customers won’t really notice at first.

It’s going to have the same exact functionality as the current editor, but from a business standpoint, it’s going to enable us to do things that we’ve never been able to do before, because the whole code base will be updated. It’ll be much quicker to add new features, remove features, change things up.

And so there is sort of a delay in some of these other things that we want to do, but we know that once we get this done, we’ll be able to pump out new features way, way faster than we ever have been, right? So we’re looking at all of those different factors, not just, you know, how much money are we gonna make off of it?
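The cost-of-delay idea Graham describes can be made concrete with a small, illustrative scoring function. Weighted shortest job first divides cost of delay by estimated effort, so a cheap feature whose absence is expensive floats to the top; the feature names and numbers below are invented for the example, not Interact’s actual backlog:

```python
def rank_by_wsjf(features: list[dict]) -> list[dict]:
    """Order features by weighted-shortest-job-first score.

    Score = cost of delay / estimated effort. Higher score means
    the feature should be done sooner.
    """
    return sorted(features,
                  key=lambda f: f["cost_of_delay"] / f["effort"],
                  reverse=True)

backlog = [
    {"name": "AI quiz generation",   "cost_of_delay": 8, "effort": 5},
    {"name": "ranking & categories", "cost_of_delay": 5, "effort": 8},
    {"name": "editor redesign",      "cost_of_delay": 6, "effort": 13},
]
# With these made-up numbers, 8/5 = 1.6 outranks 5/8 and 6/13.
```

The point is not the arithmetic but the framing: a long-requested feature can still score below a strategic one when its cost of delay is low relative to the effort it takes.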

That’s probably one of the least of our concerns, I’d say, mm-hmm, when we’re thinking about the product side of things. Not to say we don’t need to make money, obviously, but there’s a lot of things we look at. I kinda have an interesting question. Cuz, you know, there are companies that are constantly releasing new features, but then I go to use them or try them out and, oh my God, they’re buggy, you know, or they don’t work.

What is your stance? Is it better to release a feature and not be 100% positive that it’s good to go and foolproof, or is it better to wait and release it when it’s more perfected? So that really depends, and this is my opinion, but I think it depends on where you are as a business.

If you’re Google, right, and your new feature is touching hundreds of thousands, millions of people, it’s a little bit different. I also think it depends on the feature that you’re building, the product that you’re building. So imagine you’re building a financial product that deals with money transactions from one bank to another, one institution to another.

You need to make sure that that’s correct before you release it, so you’re not just sending people’s funds willy-nilly all over the world, right? Versus what we’re doing with the AI stuff, where we have the capability of going and updating the quiz, mm-hmm, or changing up the prompts a little bit to fine-tune that quiz that’s generated.

And so for us, right, with the AI, we’ve looked at what we need for an MVP, and we know that it doesn’t have every single feature that we want it to have, or that there might be a small bug here and there. We’re okay with that. At least with this project, and where we’re at as a company, we would rather rapidly iterate and have a really strong, fast feedback loop from our customers and from you guys.

Rather that than delay, delay, delay. And again, that goes back to that conversation we had as a team when we were planning out this project: how important is it to get it out there immediately versus, at the other end of the spectrum, waiting five years before we do it? We had all of these conversations during the product meeting, so that’s where we stand with the AI stuff.

Did that answer your question? Yeah, that 100% answered it. It totally makes sense that someone like Google, if they’re releasing something like that, they need it to be foolproof. But I was just thinking about how Venmo used to be great, and then, did PayPal buy Venmo or something happen like that?

And ever since then, Venmo sometimes freezes on me when I’m sending someone money, and I’m like, did it send, did it not send? Now I need to close the app and open it up again. And I always panic because I’m like, this is your one purpose and you are buggy, and it’s making me really nervous. So, yeah, I mean, I’ve worked at bigger companies as well.

I’m a startup person through and through, but I have worked at some larger companies, and they just operate differently. Yeah. And I’m not trying to bash them at all. There are really great things about working at some of these larger companies, but there are some bureaucratic-type things that can get in the way of rapid development, or of a certain feature that might be nice to put in, that you know your customers want, but for whatever reason doesn’t get through the product spec. And that can be very frustrating for the people who work there, and obviously for the people who use the product as well.

And every company is just gonna be different in terms of how they handle that. That is the nice thing about Interact. We’re a small, close-knit team, we’re constantly in communication, and we don’t have any investors or anything like that, so we can kind of test things out and do what we wanna do.

I love that we’re able to talk to you guys and our engineering team so freely. I’ve been at other companies where I felt like the engineers were behind a cement wall, and you had to somehow get through to them, whether it was picking through the cement or whatever. It was really hard to even get in contact with them.

But you guys are all so easy to talk to, and you help us with things; you explain things to us when we’re like, why can’t we do this or this? I feel like it makes our team so much better having that. That’s something that Matt and I talked about. So, I don’t wanna say I was the first engineer that was hired under Matt.

I think you guys might have had some other people before me that had left. But as we discussed what we envisioned the engineering and product team looking like going forward, we really wanted to hire people who are really strong communicators and who have a business frame of mind as well. Because at the end of the day, I think Matt and I would both agree that this isn’t a side project, this isn’t a school project. There’s a business behind every keystroke and every line of code that we write.

And so we can’t just be so heads-down on writing the code that we forget about all of the interactions that you guys have with customers. We need to be able to put ourselves in the shoes of customers. And this is the other thing we’re pushing for. I sat in on one client call, I forget who it was with, but I think, you know, engineers should be sitting in on client calls. Just be a fly on the wall and observe how customers are interacting with the tool, the questions that they’re asking you guys, and then synthesize all of that information and try to understand how you can build that into the product in a very seamless, easy, user-friendly way for them.

Yeah, I love it. And we have a good connection. You guys aren’t so disconnected from our customers, and you understand what you’re writing your code for and how it’s being used. Yeah, it’s pretty cool, because as a fully remote company, it’s not like we can just go have those water cooler conversations. And yet, out of all the teams that I’ve worked for, I mean, we use Slack heavily here, right?

We post a lot of questions publicly, so even if it’s not over a Zoom call or a Slack huddle, I can see what’s going on in the different Slack channels that we have. And obviously I have always felt empowered to reach out to you guys and ask questions whenever something pops up in my mind. I’ve never felt like there was this gated wall that I can’t get through.

Yeah, I love that. I agree. So, to close this out, unless you guys have any other questions, I was gonna ask: what’s next for Interact AI? So currently what we’re working on is backend features that aren’t directly customer-facing, but what we’d like to do is start building out the system where we can programmatically take the responses from the questions, feed them into the AI system, and have it all done behind the scenes. And then eventually what’ll happen is you guys will get notified that, hey, the AI quiz has been generated.

Again, going back to the idea that it’s not just a machine that’s building the quiz for the customer. You guys will have input on it.

You guys will be able to look it over before we pass it on to the customer. Having that process automated is the next step. And then dogfooding it with our own team, just to make sure that we can work out any major kinks or bugs or issues that occur before we release it to the customer.

And again, going back to one of the questions I think you had, Jesy, about whether it needs to be perfect. There’s a spectrum, and that’s where we landed on the spectrum: yeah, we’re not gonna just release it and have it throw out crap quizzes to people. Right.

We wanna vet this out internally with the team first and see how it goes before we release it to our customers. Yay. It’s exciting. Beautiful, beautiful. Well, Graham, thank you so much for joining us. You will be with us once a month answering engineering questions, right? Of course, it’s a pleasure.

I just coerced him, everyone. But yes, what am I gonna say? We do wanna have, you know, closer communication with the customers about what’s going on behind the scenes at Interact, how things are being built and whatnot. But we love having you on here, Graham. So that’ll be really fun and interesting.

And for those who are listening, if you have any questions about the product, let us know. We can answer them on a podcast, hopefully with Graham on here. And as always, you can fill out the form for Interact AI, which we will link for you in the show notes. You’ll get in touch with someone from the team so that we can get your quiz to you within the next 48 hours.

And that’s all I got. We’ll see you guys next time. Bye bye.

Make Your Own Quiz For Free
Jessmyn Solana

Jessmyn Solana is the Partner Program Manager of Interact, a place for creating beautiful and engaging quizzes that generate email leads. Outside of Interact, Jessmyn loves binge-watching thriller and sci-fi shows, cuddling with her fluffy dog, and traveling to places she's never been before.