EP 64: Joshua Johnston from Capsher Technology
0:00 All right. Well, welcome back to another episode of Energy
0:04 Bytes. I'm Bobby Neelon. John Kalfayan and Todd Bush are in Vegas for Quorum's conference today, so it's my first time running the show by myself, but excited to have Josh Johnston here from Capsher
0:17 Technology. Good to be here. Thanks. Yeah. You're the project consultant and implementation lead at Capsher. Yep. So senior dev, product consultant, kind of like an architect, product
0:28 design, project manager role that you might see at other larger corporations. Yeah. Very cool.
0:37 So yeah, let's just dive right in. Capsher as a whole, what do you guys do? And especially, seeing as we're Energy Bytes, how does that play in
0:46 with energy, oil and gas and different verticals? Yeah, of course. So Capsher is a software consulting firm, right? We write a lot of software, but not for ourselves. Sure. Our company
0:58 started at Texas A&M. Our owners both worked in the petroleum engineering department, and they were writing software for some of the big O&G players. Okay. And they decided that
1:12 rather than work through these big firms, they wanted to create a company that they wanted to work at, so they kind of set off on their own 30 years ago. In fact, this year was our 30-year
1:24 anniversary. Yeah. And so a lot of software projects in oil and gas,
1:33 you know, all the different strata, from planning to fracking to drilling to casing running, yeah, completions, post-completions. Over 30 years you've seen it all. Yeah, the technologies, including
1:46 fracking.
1:49 Yeah. Or horizontal fracking rather; fracking's been around forever. Yeah, a lot of everything, vertically, right. And then, especially when COVID hit and O&G really took that big
2:04 hit when the price inverted, Capsher started trying to divest and diversify. So we've actually had a lot of opportunity to do some projects in a lot of other domains: fast food, car washes. So
2:20 it's really all the same when you get down to it, software is software, but a lot of interesting problems. It's fun to see them here and there. Yeah, absolutely. It's all the same thing. But
2:30 certainly a lot of experience in O&G and energy. That's kind of our bread and butter. Yeah, no, I mean, it sounds like the origin story was there, and you're in Texas. So
2:44 I think kind of an interesting thing to hit on there: you've been with them for what, 15 years? 15 years, 15 years now. So you've seen a lot, but maybe you can speak even
2:55 to some of what happened before you got there. But -
2:59 I mean, software development has changed radically in 30 years. And even in 15 years, much less,
3:05 you know, 30 years. Yeah, for sure. Curious, you know, how the projects have changed and how they're the same, too. I think the more it changes, the more it's the same. But yeah, it's funny,
3:16 right? It's kind of that old aphorism, the more things change, the more they stay the same. Yeah. But certainly, right, starting 15 years ago, everything was C and desktop apps, right?
3:28 The web was still very, I mean, 2010, so - Yeah, you were starting to get some cloud computing, and yeah, it wasn't what it is now. Right, web apps were definitely a thing, but not, yeah, like,
3:41 cloud environments were still pie in the sky.
3:47 But yeah, a lot of C, a lot of low-level desktop apps, yeah, a little bit of Java here and there. So certainly, compared to then, what you see now, you've got obviously the .NET stack, which
3:59 is just huge. Yeah. For as much flak as Microsoft got 10, 15 years ago, they really did a good job pivoting and kind of becoming the dominant platform, especially in cloud computing. Yeah,
4:14 especially down south with oil and gas, I mean, a lot of people lean that way. Yeah, yeah. And for good reason. Yeah, absolutely. But then of course, you know, O&G, everybody
4:23 loves their Python and Excel worksheets, so love 'em or hate 'em, you gotta know how to do all that stuff, yeah. Yeah, I'm big on it. You gotta meet people where they're at. Exactly.
4:37 You can give them the best thing in your mind, or maybe it really is, but if they're not ready for it, then it doesn't matter. Yeah, and then, you know, everybody still has their legacy systems, so
4:50 there are certain projects where we're still looking at mainframes, right? These big legacy systems where it would cost them decades and millions, tens of millions of dollars to
5:01 fully transition. Yeah, no real benefit. No real business case. Yeah, exactly.
5:07 No, that's cool.
5:10 So I guess looking at, I mean, where are you guys most focused today? Or, are you starting to get a lot of AI
5:20 kind of questions from customers? Like, hey, how can we integrate AI? Yeah, again, for better or for worse, right? A lot of, especially a lot of the upper management, see AI
5:33 as a panacea, right, a silver bullet. Yeah. Like, hey, can we jump on this AI train and cut costs by whatever percent? Yeah. Certainly there's some stuff that AI is very good at. Yeah. And
5:51 if you're harnessing it for those things, right, it can certainly pay out.
6:01 But
6:03 it is a specialized tool for a specialized job. It's not gonna make all your problems go away. In fact, it's gonna introduce a whole lot of problems 'cause now you gotta worry about where did my
6:14 training data come from? Sure. Am I using a public model, or is it gonna stay internal? Exactly, is it gonna leak customer data back to OpenAI or whatever, so. So, I mean, maybe if we dive
6:28 into that, so you said there are some things that it does really well. What would you say are some things that you've seen in practice, or things that you guys
6:35 have implemented, that work really well? Yeah, so, you know, we've got a few projects, a few customers using it for a few different things, but overall the things it's good at is
6:49 consuming just a boatload of data, right? Mm. That's exactly what it's built to do, finding patterns and associations. Hey, generally when you write a
7:04 document, and you've got these big companies with corpuses of tens of thousands of white papers that they've written over decades, right, on topic A or topic B. Hey, generally when they're writing
7:18 about topic A, they talk about these things, right? And when they're talking about topic B, they're talking about these things. And so AI is great at, you know, helping you find those
7:26 associations. So hey,
7:33 I need reservoir performance or fracking, or, hey, I'm seeing such and such happen on my well production, and then it can go and find all those white papers that might be associated with that,
7:47 rather than you having to sift through, you know, hundreds of papers on your own. So it can help shortcut that stuff. Generally it's probably not gonna give you the super accurate answer magically.
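A rough sketch of the kind of corpus search Josh is describing: scoring a pile of white papers against an engineer's question so nobody has to sift through hundreds by hand. TF-IDF cosine similarity via scikit-learn stands in for whatever embedding model a real system would actually use, and the paper snippets and query below are made up for illustration.

# Rank a corpus of white papers against an engineer's query.
# TF-IDF + cosine similarity is a stand-in for a real embedding model;
# the documents and query are illustrative, not from any actual corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

papers = {
    "paper_017": "Reservoir performance decline analysis for tight gas wells",
    "paper_142": "Frac hit mitigation strategies on multi-well pads",
    "paper_305": "Mud pulse telemetry decoding at higher data rates",
}

query = "sudden drop in well production after an offset frac job"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(list(papers.values()))  # one row per paper
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, doc_matrix).ravel()     # similarity per paper
ranked = sorted(zip(papers.keys(), scores), key=lambda pair: -pair[1])

for paper_id, score in ranked:
    print(f"{paper_id}: {score:.3f}")  # engineer reviews the top hits, trust but verify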
7:59 No, for sure. And I know even the folks here, you know, formerly Digital Wildcatters, kind of just calling it Collide now, and the application is Collide. You know, they've
8:09 got their AI platform here, but I think a big thing with that, especially when it's something you're doing internally, is that you can build these models, and now they can reference where it came from
8:18 as well. So, you know, it can give you a summarized answer, but it can also point you to the document, so it's more of a trust-but-verify thing, whereas, yeah, I think the struggle with a lot
8:27 of these public models is just, they're very confident and they're gonna give you an answer. And much like people had to be smart even when they googled
8:36 things, don't believe everything you googled, now it's like, this thing will give you a very confident answer, and you could run with it, but you really gotta be careful. Yeah, for
8:44 sure, right? It's like the AI images where, hey, do people really have six fingers? Well, no, right? Looks legit, yeah. Yeah, so I mean, text AI is no different, right? It can be really
8:58 good at getting you 90% of the way there, but those final little steps, yeah, certainly trust but verify. Yeah, so if you all have done some of these
9:07 implementations, are these like RAG model type things? Yeah, some different RAG models, you know, mixtures of LLMs, so,
9:21 specializing your LLMs, right? You're not just gonna have one that gives you everything you want at once, so have one that's good at, you know, maybe searching for the information, have
9:33 another that's kind of specialized in, okay, given these search hits, helping you narrow it down. And you see the same thing with image generation, right? You've got
9:54 the parts of the model that are good at, hey, I want a picture of Darth Vader riding a unicorn, well, we'll find you some Darth Vaders and unicorns, right? Sure. Then you've got
10:02 another aspect that kind of puts that stuff together and makes it look like one image. Yeah, yeah.
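A toy sketch of the "specialized LLMs chained together" idea: one step searches for candidate passages, a second step works only on those hits. Both steps are stubbed out as plain functions, since the actual models and APIs will differ from project to project.

# Two-stage pipeline sketch: a "search" step and a "narrow/answer" step.
# Each step is a stub standing in for whatever model or service a real
# project would wrap; only the shape of the pipeline is the point here.
from typing import Callable, List

def search_step(query: str, corpus: List[str]) -> List[str]:
    # Stand-in retriever: keep passages sharing any word with the query.
    terms = set(query.lower().split())
    return [p for p in corpus if terms & set(p.lower().split())]

def answer_step(query: str, hits: List[str]) -> str:
    # Stand-in generator: a real pipeline would prompt an LLM with the hits.
    joined = " | ".join(hits) if hits else "no supporting passages found"
    return f"Q: {query}\nBased on: {joined}"

def pipeline(query: str, corpus: List[str],
             search: Callable = search_step,
             answer: Callable = answer_step) -> str:
    return answer(query, search(query, corpus))

corpus = ["Frac hit mitigation on multi-well pads", "Casing running best practices"]
print(pipeline("what causes frac hits", corpus))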
10:14 And I guess more on the actual technology beneath that, I'm sure Azure, GCP, AWS have, you know, some platforms that you could
10:22 probably stand some of this up on, but then also, you know, there's different vector databases, and definitely the build versus buy question or whatever. Right. So what are you guys
10:31 seeing, or is there anything that you've leveraged that you guys particularly like?
10:37 It really depends on the client's needs. Yeah. So certainly, you know, some of our customers are, hey, let's just utilize the kind of model OpenAI has already built, because we just want a
10:53 more general English, natural-language-processing kind of aspect to it, a chatbot, right? Yeah. And so we'll take that. The majority of it's gonna be like an on-prem
11:08 solution, you know, so rather than relying on OpenAI's servers, they'll pull it in that way so they're not leaking data back out. Yeah. But then
11:21 putting, you know, custom training or different customizations on top of those general models for whatever fit-for-purpose needs they have, right? But then we'll have other customers who, you know,
11:33 don't care about what has been trained on the internet, 'cause I'm not gonna go ask it questions about Langston Hughes, right? I've got this corpus of white papers on all these
11:46 different topics and that's what I wanna train it on. I want it to be super specialized, engineering savvy.
11:53 And then, right, so training up those models. And then a lot of it, in terms of hosting solutions, is gonna be cloud stuff, right? 'Cause especially when it's time to train a
12:07 model, obviously there's a lot of computing power involved; you wanna be able to scale that way up and just crunch through gigabytes of documents, right? And then scale it down when you're done. Sure, but
12:20 even then, executing those models can be computationally intensive. So, yeah, those scalable infrastructures, VMs, images, app services, whatever solution Azure,
12:36 AWS, all those other cloud providers will give you, it just depends on whether the customer has an existing relationship with whatever cloud provider. Yeah, certainly. That makes sense.
12:52 Basically, what I'm thinking too with a lot of AI stuff is you've got to build modularly, because it's moving so fast right now. You could pick the best solution today, and in a week, much less, you
13:02 know, two, three months, it could be out of date, and you need to be able to rip and replace where possible. Yeah, absolutely, you know, a thousand percent.
13:11 With AI it is such a quickly changing landscape, but really with any software, right, you don't want a tight coupling to any specific solution. You want your nice clean layers there.
13:24 Yeah, if you want to swap out a model or, hey, LLM 2.0 or RAG or whatever, right, whatever the researchers come up with in six months. Yeah, right, that's going to be the newest, biggest,
13:38 best thing, so hop on it. There you go. Yeah,
13:42 sweet
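A small illustration of the loose-coupling point above: code against a narrow interface so a model or vendor can be ripped and replaced without touching the callers. The class and method names are invented for the example, not any particular product's API.

# Callers depend only on the TextModel interface, so swapping the backing
# model (hosted, on-prem, next year's newest thing) is a one-line change.
from typing import Protocol

class TextModel(Protocol):
    def complete(self, prompt: str) -> str: ...

class HostedModel:
    """Would wrap a cloud-hosted LLM endpoint."""
    def complete(self, prompt: str) -> str:
        return f"[hosted] {prompt[:40]}..."

class OnPremModel:
    """Would wrap a locally hosted or fine-tuned model."""
    def complete(self, prompt: str) -> str:
        return f"[on-prem] {prompt[:40]}..."

def summarize_report(model: TextModel, report: str) -> str:
    return model.complete(f"Summarize for the morning meeting: {report}")

print(summarize_report(OnPremModel(), "Stage 12 screened out at 9,800 psi ..."))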
13:45 So, moving kind of away from the AI for a little bit, back towards some of the various solutions. So I mean,
13:54 keeping it energy specific, can you speak high level about, you know, whether it's recent wins or software you guys have implemented, or even just, you know, one of
14:03 your favorite projects over the last 15 years, and maybe dive into what that looked like and some of the design choices and how it all worked. Yeah. So, being a
14:14 consulting company, we've been really fortunate that we get to do a lot of cool projects. So some
14:25 of the ones that I've been lucky enough to work on that were really cool, they've probably been ones
14:36 where we've taken high resolution data at 20 hertz, 40 hertz, whether it's a fracking operation. Yeah, like the water hammer data. Yeah, exactly. Or even drilling, right?
14:53 Mud pulse decoding, that type of stuff.
14:58 Taking all that data, being able to do stuff out at the edge, right? On rig or in the field, at the job site.
15:13 Having that capability of providing some of that real time, regardless of whether your satellite connection is up or down, being able to make decisions, but then shipping that data up to the central
15:26 server and helping to make sure that, you know, when crew A is making decisions, that information goes to night shift, right? Yeah, 100%. Yeah. 'Cause it's so easy without that
15:40 connectedness, you know, they throw a clipboard of notes in the truck, and then the truck drives off and the clipboard falls behind the seat. Well, you see it with, you know, again, with
15:49 different frackers and wireline crews, like, oh, these guys, you know, this company man or whoever is ramping up the frack harder, and there's another guy who's a little, you
15:57 know, more conservative. Yeah. And now you can have, you know, five stages from the morning that are operated totally differently than, absolutely, the ones in the
16:07 afternoon. But even, like, I know John and I have always been big on that local connectivity, because we worked at a company, Reservoir Data Systems, for a while. Yeah. It was
16:16 kind of more on the frack side, and we streamed pressure gauges and sensors, but, like, connectivity is not a guarantee, even in this age. Absolutely not. And, you know, while I love
16:26 the cloud and there's a lot of things it's really good for, there's some latency getting the data from there, even if you are connected, before someone else can even see it, which again,
16:34 five, 10 seconds is probably nothing in most cases. Right. But you lose that connection and you have critical operations that, you know, are worth millions of dollars, and you have
16:43 even lives out there that are depending on this stuff, so it needs to work there as well. Well, especially stuff like pressure sensors and gauges, right? You can't have a pressure
16:53 spike, or, right, you see a kick or hit an ESD, you have to be able to shut things down immediately, right? Or you're, like you said, not only risking millions of dollars of
17:06 equipment, right, if you blow a fluid end, but, like you said, lives. Yeah. So. Yeah, and I think that's where, you know, we've discussed it before too, but, you
17:17 know, a lot of people want to think that, say, oil and gas is slow to adopt technology. And I think it is in some ways, but on the other side, they have to be in
17:26 some ways, because, I'm like a broken record on this, probably, Spotify can be wrong on their algorithm and you just flip to the next song. Yeah. But if a
17:35 machine learning algorithm is wrong in the field and someone just follows it blindly, that could, you know, again, lead to serious consequences. Yeah, I mean, there's a reason
17:44 why when you do data acquisition in the field, you're still dealing with OPC and WITS and even CAN bus, right, 20, 30, even 40 year old technologies, well, it's 'cause they work. Yeah. So it's,
17:57 yeah, it's just not worth the risk of trying to overhaul everything and retrofit all your equipment, but then, yeah, you have that risk of not only NPT, but something goes down, you lose your sensor
18:10 readings. Yeah. Yeah, you can really mess things up really quick. For sure.
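A bare-bones sketch of the store-and-forward pattern from the edge discussion: keep buffering readings locally so decisions can still be made on site, and flush to the central server whenever the link happens to be up. is_connected() and upload_batch() are placeholders for whatever transport a real site actually has.

# Buffer readings at the edge and ship them when connectivity allows.
import time
from collections import deque

buffer = deque()  # a real edge box would use durable storage, not RAM

def is_connected() -> bool:
    return False  # placeholder: check satellite/cell link health here

def upload_batch(batch) -> None:
    print(f"shipped {len(batch)} readings to the central server")  # placeholder

def record(reading: dict) -> None:
    buffer.append(reading)  # always keep the data locally first
    flush_if_possible()

def flush_if_possible(max_batch: int = 500) -> None:
    while is_connected() and buffer:
        batch = [buffer.popleft() for _ in range(min(max_batch, len(buffer)))]
        upload_batch(batch)

record({"ts": time.time(), "sensor": "treating_pressure", "psi": 9450.2})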
18:21 So you piqued my interest when you started talking about the 20 hertz data. We actually did a good bit of data collection on some high resolution stuff at that time too, but I'm curious if you can talk high level about the data acquisition, just the IoT devices, and then even, are
18:33 there certain lower-level programming languages you're needing to use? Like, C obviously has been around forever, but Rust has kind of come up, so just curious on
18:44 your thoughts on that. Yeah, so
18:48 definitely, as much as the college grads don't really hear about it these days, C is still definitely out there. Yeah. Rust, we haven't done commercially with too
19:02 many customers
19:07 But PLC programming, so not quite C, but
19:14 whether it's a Beckhoff or a 3XL, right? Dealing with the low-level byte manipulation
19:23 definitely is still very big. And especially anytime you're out towards the edge and integrating with those sensors, sure, there might be a nice, even a .NET driver for it, but at the end of the
19:40 day, you still have to know how to manipulate bytes and ship them around and things don't work perfectly. You're gonna have to debug at the byte driver level. So it always kind of comes back to C,
19:56 even if it's not specifically that language, right? Those fundamentals, you need to be able to understand low-level data structures, low-level network architectures, right?
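A small example of the byte-level work being described: unpacking a raw sensor frame by hand when the driver isn't telling the whole story. The frame layout is invented for illustration, not any real device's spec.

# Parse a made-up sensor frame: big-endian uint32 timestamp, uint16 channel,
# float32 value, and a one-byte checksum over the preceding bytes.
import struct

FRAME_FMT = ">IHfB"                      # assumed layout, not a real device spec
FRAME_SIZE = struct.calcsize(FRAME_FMT)  # 11 bytes

def parse_frame(raw: bytes) -> dict:
    if len(raw) != FRAME_SIZE:
        raise ValueError(f"expected {FRAME_SIZE} bytes, got {len(raw)}")
    ts, channel, value, checksum = struct.unpack(FRAME_FMT, raw)
    if checksum != (sum(raw[:-1]) & 0xFF):
        raise ValueError("checksum mismatch - time to debug at the byte level")
    return {"timestamp": ts, "channel": channel, "value": value}

payload = struct.pack(">IHf", 1717171717, 3, 8123.5)
frame = payload + bytes([sum(payload) & 0xFF])
print(parse_frame(frame))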
20:09 Especially for the debugging aspect of writing those systems, 'cause if something starts to be slow, right, if you have to hit 20 hertz, 40 hertz, even 100 hertz to make those quick decisions and
20:24 stuff isn't coming in that fast, you have to be able to figure out where's the slowdown. Yeah.
20:33 Make those low-level programming decisions about, hey, what do we need to do to be able to acquire data this fast? 'Cause once you start talking about stuff coming in that fast, it's no longer
20:45 trivial, it's no longer free. Yeah, well that, and then having the storage and compute downstream of it, even on site, maybe you can speak to that. I guess it's
20:57 the
21:00 law that
21:03 compute will double every year or whatever, I forget what that law is. Moore's law, yeah. But I mean, I think we're seeing that a ton, like say on the edge. You know, John used to work at an edge
21:12 computing company, Hivecell, which, since, hasn't really taken off, but it was a great concept of being able to stack basically servers on top of each other, like
21:22 little Kubernetes clusters on the edge. But just what's been enabled by that, you know, what you can do on the edge now versus what you could do even 10 years ago. Well, even then, you've got
21:32 containerization technologies, right? So the same containers, whether it's Docker or whatever, that you are throwing out in Azure or AWS and saying, hey, Azure, spin up 45 of these for
21:45 me, right? The same patterns, the same technologies. Yeah, absolutely, you're running them now at the edge. So a lot of our projects will write one app that's designed and architected to be
22:00 containerized, we'll create a Docker image, say, and then throw it on the cloud for the server, and then if they need that same functionality at the edge, well, cool, just throw it right on. Throw
22:12 it in Docker, use it anywhere. Yeah, absolutely, 100%, so. Yeah, no, that is super cool. One thing I'm going to pick your brain on, 'cause I'm well aware of Docker containers, I've used them,
22:24 but what are some other container technologies out there? And is there a distinct difference in - Actually, I'm going to have to say, you know, I don't know on this, because I feel like there are,
22:35 but I mean, I don't know anyone that uses them. Yeah, there's got to be, if there's not, it seems like there's a huge opportunity there too, but yeah, Docker's mainly the one we use,
22:48 definitely in terms of, like, the hosting solutions, there's either Docker or Rancher Desktop, which is kind of
22:56 the free and open source version of that. Yeah. Yeah, no, even that has been really cool, just even for local development. Oh yeah, it makes things super easy.
23:11 Cloud services are great for, like, production and running, but they are so hard to develop against, right? 'Cause it's difficult to emulate that infrastructure locally. So certainly, you know, you
23:28 have to use container technologies to be able to write and debug. Yeah, and even now with some of the more serverless type things, I mean, 'cause they can basically host your container
23:39 services, and now you don't even have to worry about Kubernetes, 'cause I don't know how much you guys have done deploying your own Kubernetes. I haven't, mainly because I've heard people say don't. Yeah,
23:49 'cause it's a monster, yeah, absolutely. But it definitely was a
23:55 game changer, 'cause I mean, even a lot of these things that people aren't necessarily containerizing themselves, whether it's a Lambda function or whatever it is, I'm pretty sure they're using
24:02 containers under the hood to scale out and do all those things. Yeah, Lambdas or Azure Functions, an app service plan, right? It's just the
24:12 smart way to do your .NET stack for your web apps. Yeah, I mean,
24:19 they've all gotta be some sort of VM or something under the hood. Yeah, yeah, of course, it's just, yeah, someone else's server. Yeah, exactly. And then you just have that whole
24:29 shared responsibility model where, as you move to more serverless things, you're reducing how much you need to manage yourself. Right. And letting them handle even more of it.
24:38 Which, you know, is kind of one of those double-edged-sword things. It's great, especially with the zero trust models. Yes. Right. It makes authentication a breeze because I let somebody else
24:51 handle OAuth and I just implement the interface and I'm done, right? Whether it's B2C in Azure or AWS
25:01 Cognito, right? Yeah. Let them deal with that. For sure. And it handles all your SSO and stuff. But at the same time, if something's not acting right, it is a bear to debug. Yeah. Trying to pull
25:11 back all the layers. Yeah. But in most cases, it's probably not worth your time to hassle with that. Absolutely not. Yeah. Definitely worth the cost. Yeah, for sure.
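For the "let somebody else handle OAuth" point, a minimal sketch of the standard OAuth2 authorization-code exchange an app performs after the identity provider (Azure AD B2C, Cognito, and so on) has signed the user in. The endpoint, client ID, redirect URI, and secret are placeholders, not real values.

# Exchange an authorization code for tokens at the provider's token endpoint.
import requests

TOKEN_URL = "https://login.example.com/oauth2/v2.0/token"  # placeholder endpoint

def exchange_code_for_tokens(auth_code: str) -> dict:
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": "https://myapp.example.com/callback",  # placeholder
            "client_id": "my-client-id",                           # placeholder
            "client_secret": "my-client-secret",                   # placeholder
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # access_token / id_token come back on success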
25:24 So I guess then, we're talking about maybe moving downstream. So you've got your 20 hertz type stuff and some PLCs, and then you've got a robust edge device,
25:35 whatever. You talk about getting data then into the cloud.
25:39 You know, what are some technologies being utilized there? I mean, are you sending aggregates up to the cloud? 'Cause oftentimes it doesn't make sense to load
25:48 20 hertz data, you know, directly to the cloud, at least over that connection; maybe it's sending aggregates, and then you can send a dump of it after the job up to the cloud
25:58 to get the granularity up there, but just curious what that looks like. Yeah, so
26:04 just to name drop here, because of all the technologies I've worked with, TimescaleDB has been - It's great, you're like the
26:12 second person in the last few episodes that - It's just a godsend. I mean, back before, man, I'm trying to think of the one before
26:25 Timescale, Influx. Yeah. So back on SQL Server, right, trying to put time series data in there, and it's just going from, you know, 20 seconds for a large query, and then, oh, hey, yeah,
26:43 Influx, we'll try that out. Yeah. Now it's only taking 20 milliseconds. What? Yeah. So just the insane, just time series databases in general. Yeah. But I mean, it's
26:58 funny. I've been going back and forth with this guy every now and then on LinkedIn, because he'll post something and then I'm like, ah, just use Postgres. Yeah, because literally Postgres has
27:06 something for everyone. And now you get Timescale on Postgres, and it's a Postgres database. It is. You just add the extension and now you can ingest insanely fast and you can run queries across
27:16 your time series. But still, it's not free, because like you said, even if you're only running at one hertz, if you're acquiring one hertz data for 20, 40 frack pumps on-site for a frack job, right,
27:32 yeah, you're looking at gigabytes of data an hour, and now you have to deal with gigabytes an hour. I do a 40 zone well, a 40 stage well, now I've got tens, if not hundreds, of gigabytes,
27:50 and now I do a pad, right? So, yeah. Compression, aggregates,
27:58 the sheer volume of data is pretty staggering sometimes. And so, yeah, like you said, a lot of times you're not going to ship up 40 hertz. Yeah, through your 5G connection. If you're lucky enough to
28:12 have 5G out there. Starlink makes it a little bit better, but sure, if you have Starlink, right?
28:20 But yeah, one hertz you can probably do pretty reliably. But then, yeah, because you can send MQTT pretty quickly. Yeah, and if the customer insists, no, we need this 40 hertz data, OK, well,
28:35 we'll ship it up as fast as we can. So strategies there, hopefully it zips nicely, right? And you can compress it or something just to help get it up. But generally, as long as you've got your
28:51 edge resources, where you can deal with that data when you need it, getting it up
28:59 an hour or even 12 hours later is usually OK. They just want a data warehouse so they can come back later and do whatever analytics and data science.
29:10 Or, god forbid, something goes wrong. They can, you know, mine it and figure out what happened and
29:18 where. Nice.
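A condensed sketch of the TimescaleDB setup described above: a plain Postgres table turned into a hypertable, a high-rate insert, and a time_bucket rollup of the kind you might actually ship off-site. The connection string and schema are illustrative, and it assumes the timescaledb extension is already installed on the server.

# Hypertable, insert, and a one-minute rollup with TimescaleDB via psycopg2.
import psycopg2

conn = psycopg2.connect("dbname=frac user=postgres host=localhost")  # placeholder DSN
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        time    TIMESTAMPTZ NOT NULL,
        pump_id INT         NOT NULL,
        psi     DOUBLE PRECISION
    );
""")
cur.execute("SELECT create_hypertable('readings', 'time', if_not_exists => TRUE);")

cur.execute(
    "INSERT INTO readings (time, pump_id, psi) VALUES (now(), %s, %s)",
    (7, 9450.2),
)

# One-minute averages per pump: the kind of aggregate you'd send up first.
cur.execute("""
    SELECT time_bucket('1 minute', time) AS bucket, pump_id, avg(psi)
    FROM readings
    GROUP BY bucket, pump_id
    ORDER BY bucket;
""")
print(cur.fetchall())
conn.commit()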
29:21 So maybe let's shift, you know, down from the more back-end stuff to, like, what are you guys seeing on the front end? I mean, I know React has made a huge
29:29 push in the last 10 plus years, but it seems like you guys are using a bit of .NET, so I don't know if there's, like, Blazor or any of that kind of stuff. Not so much.
29:43 We haven't done a lot of the, you know, Razor, Blazor technology stuff. It's mostly been, you know, REST APIs serving up JSON,
29:56 especially through SignalR, right? So that's going to be, like, your WebSocket technology. So you've got your web page that
30:04 can act like it's, you know, getting data in real time from the server. So I do something over here, and in real time that change shows up over there. So definitely SignalR, WebSocket
30:19 technologies are big. Back in the day, it was
30:24 Ajax and making those asynchronous calls manually,
30:29 but REST APIs are still huge, obviously. GraphQL seemed like a good way of doing that, or it seemed like it had a lot of promise at one point, and I just feel like it's kind of petered out a little bit. Yeah, no,
30:40 we haven't done too much with it. It kind of seemed like, GraphQL I think it was, this was going to be the next thing. That's fine, yeah. Yeah, no. REST
30:52 APIs, they're pretty tried and true. I mean, I think we talked about this a
30:54 lot, you know, because I lean more on the data analytics side, and it's great if you have a REST API, but for me to pull a lot of data out of a REST API, most of them are not really
31:02 well written, and they're not great for moving large amounts of data into, like, a warehouse, but if it's the only option I've got, it's fine. But for building for a specific application, I mean,
31:12 I think it's still the best way, and it definitely beats SOAP. Yeah, yeah, it's like trying to move a bunch of data over with XML, and it's like, oh my goodness, how much
31:22 overhead do you need to add to just give me the data, right? So certainly don't try to screw in a screw with a hammer, right? Use REST APIs for what they're good for, and if you need a
31:37 way to stream gigabytes a day, then you'll use something else, right? You've got ETL tools that exist for that. For sure. Use those. So I mean, I think we talked a lot about maybe some of those
31:48 kind of edge to
31:50 consumer kind of products, but I imagine you guys get all different kinds of stuff. Any other kinds of projects that you guys see a good bit of, or any certain trends, you know, outside of the AI
31:59 stuff, that you're getting a lot of requests from customers for that are kind of similar? A lot of still mobile apps, right? Like, hey, I've got an engineering app that runs on an old Excel
32:16 file, or, you don't hear that too much these days, but that's what it used to be. Sure. Or I've got a
32:28 web server where the web page isn't responsive, and the C-level people want their dashboard and their KPIs and their analytics on their phone. Yeah. Or the company man wants to be able to
32:42 pull out his phone and monitor the status of the current set of perfs
33:02 or a frack and be able to know, okay, yep, they're hitting my target volumes. They're gonna be down to target depth in another hour. So I have time to go, kick off whatever maintenance or
33:03 whatever we need to do. Sure. So definitely a lot of
33:11 getting the right data to the right people in the right format, whether it's a mobile app or a specialized kind of web app. You
33:22 know, do you work much, say, in the more traditional BI tool space, or do you try to stay more, you know, custom software? Yeah, generally not. What
33:33 we tell our customers is it's probably not worth your money to pay us to do that. Fair, yeah. Right. We will set up your database for you, or if you've got one already, we'll get the data there.
33:46 Yeah. And then we will do everything we can to make Power BI sit nicely on top of that. Whether it's Power BI or, you know, Spotfire, whatever, Jupyter. Yeah, yeah, exactly. I mean, at
33:59 that point it should be centralized there, and then whatever you want to use, yeah, we'll make sure the data gets there, whatever QoS you want to put on top of it, right? Real time or within
34:10 the hour or 12 hours after the job or stage ends, we can hit that, but then, yeah, put whatever you want on top of it. So coming back to the mobile development side. So I feel like, for a while
34:26 there was a trend of, I think there were different libraries or ways of doing it, whether it's a progressive web app or it was Xamarin. Xamarin. But like we'll let you write once and then deploy
34:37 to Android or Apple, but I feel like now the trend has been away from that again and doing more like, building with two separate code bases, one for Android, one for Apple, just 'cause I think
34:48 it's a more tenable solution, but just curious what you guys have seen and what works. Yeah, I will confess I haven't done a mobile app probably in the last couple of years. The last one I did was,
35:04 I guess it was Dart, so Flutter, kind of a Xamarin successor, right, built by Google, meant to be able to cross-compile to either. Those technologies are great if you
35:20 need kind of a basic app and you don't want to have to worry about the ins and outs of iOS versus Android. Yeah. But again, it's a common framework. So if you need some specialized thing, or,
35:36 you know, I need to do notifications in some weird way, then how much are you fighting that common framework to be able to find specialized ways to do things? Yeah, is it worth it, for sure.
35:49 Or if you need custom graphics, you know, Unity, I guess, or something like that would provide a good library, but once you get
36:01 down to those lower-layer types of functions, how much do you want to fight those common libraries versus, I guess, is it cheaper to just say we're gonna go native with whatever iOS is now versus
36:17 Java for the Android stuff. Yeah, no, for sure. And I haven't really delved too much into it, but I've been kind of adjacent to some of those conversations, and I know when I
36:28 was doing my coding bootcamp, which I believe was probably at least eight years ago, progressive web apps were becoming a thing. I was like, oh, well, now you can write a web app
36:35 and it can also deploy as an app. But then, like I said, and I've seen it in a similar way with, say, low-code platforms, or even BI tools, which are kind of low-code, they can
36:49 do a ton, but you get to these edge cases and you're like, I have to do some black magic to make it do what I want it to do.
36:57 I mean, it's kind of the same with even .NET, or like EF Core, right? Some of the solutions within .NET are great at making things easy, but as soon as things aren't easy,
37:09 they are really not easy. Yeah. Yeah. And maybe this is a good way back into some of the AI stuff. So now we're seeing all these different kinds of abstractions with AI,
37:18 like, I mean, there's Lovable out there. I used it to build my personal website and my business website, and it was insane how fast it was able to build them. But again, it's more of a
37:29 static website type deal with a
37:33 little less complexity and man, it did a really great job.
37:38 But, you know, what are you guys seeing, as far as, because I feel like with some of those, power users are going to be able to sit there and talk in natural language and develop
37:47 some of these kind of full fledged applications.
37:52 You know, what are you seeing as some of the benefits to that? But then also some of the gotchas that people aren't, you know, taking into account and then even maybe down, you know, lower level
38:00 than that, now you have, say, stuff like Cursor or Windsurf and some of this AI-assisted coding where, you know, it's still coding, but people are
38:11 able to talk to it to help them write the code, maybe not as off the shelf, say, as a Lovable. And I guess people are actually using Lovable, saving it to GitHub,
38:20 and then plopping that into Cursor to help
38:23 harden it, right? But just curious what you're seeing say maybe from a customer standpoint on that, but then even like maybe internally, what are you guys looking at and, you know, have you all
38:32 tried out some of the technologies there? Yeah. So I'll talk internal first, just because I think that's the first thing I thought of, right? Copilot with Visual Studio is awesome. Yeah.
38:48 All this AI stuff, it's good - I don't remember if we were recording when I said this earlier, but it's good for getting you 80, 90% there, right? All the code you're gonna write, you're gonna do
39:04 the same thing over and over again. Declare a variable, right? Connect to an API. Loop over this specific thing, yeah. So the tab completion, especially Copilot inside of
39:18 Visual Studio when you're writing .NET, it's amazing how pretty darn good that is. Yeah. But you're only doing 10 different things over and over and over again, right, when you're writing code.
39:31 Sure.
39:34 So for those aspects, it's great. And I would say even the customer-facing stuff that people are wanting to use AI for, it's really, really fantastic at getting you 80, 90% of the way there with
39:52 80 to 90% accuracy, right?
39:56 Trust but verify. So you always just have to check it. Those last 10 percent. Yeah, it can.
40:07 It can give you a pretty darn good website or write you a pretty darn good summary or spin up a boilerplate document. It's fantastic for those things 'cause there's only so many ways that you can say
40:17 the same thing. Yeah. Right.
40:20 But then look at it, make sure, use your human brain, right? That's one thing that AI
40:27 doesn't have. Yeah. Well, and even some of the domain knowledge. Right. Some of these get more, you know, more narrowly trained, and you could actually probably have, like, oil and gas
40:38 trained software, you know, if you really got into it, but it's not there yet. And even still, it's going to be trained, so there's going to be bias in what
40:48 it's been provided. Right. Yeah, obviously it is going to be biased toward whatever it was trained on, or if it was trained on a lot of stuff, it's going to be, right, and it's kind of
41:00 a funny problem in AI,
41:02 machine learning, anything like that:
41:06 you don't train it enough and it's like talking to a toddler. You train it on too much and now everything is kind of just this smeared, uniform shade of gray. You start to lose the interesting
41:22 features that exist in language and the way people have written stuff on purpose. Now maybe that doesn't matter for like a white paper where you want everything.
41:39 I was talking to people over the weekend, and I think that a lot of that's still going to shine through. It might even shine through more in this day and age now. Like, if you get a truly
41:48 transcendent
41:51 songwriter or author or something, I feel like those types of things are going to shine through even more, because it's just gonna be in a sea of AI-generated content. Right,
42:02 and I mean, people knock AI for, oh, all this stuff is copycat. I mean, look at human history, though. There's nothing new under the sun, right? How many songwriters
42:17 for all of time have been just kind of formulaic and all that stuff? And it's not - Sure, and, like, one of the bigger authors has a ton of ghostwriters underneath him. He
42:28 barely writes any of his books at this point. Like,
42:32 and copywriting has been a thing forever, where even people writing blogs, you know, I think the customer's providing them, here's the content, but someone else is writing the
42:40 copy whenever it gets published. So I mean, it's a fine line, but at the same time, like you said, as you get into those more, probably, human artistic domains, you
42:51 know, I feel like some of that stuff's gonna shine through. Yeah, and certainly just the same as it does now, right? In a sea of, everything you hear on the radio is kind of one or two notes,
43:03 right? Those things that do stand out now, is AI going to get even better at that? Yes, probably, but
43:12 it's never going to be Skynet, right? Until AI can assemble and recompile and retrain itself and build new hardware for itself all on its own, it's always going to be
43:27 dependent and just another algorithm, a fancy, very complicated algorithm, but it's just an algorithm. So, kind of going in another direction with AI, we didn't really get into your
43:37 background, but I see on LinkedIn that you've basically got a PhD in AI and machine learning. That's right, yeah. So, is this implementation of AI in the last, let's call it, two, three, four
43:50 years now, since 2021, is this what you kind of envisioned it would be once it finally took off, or has it exceeded your expectations? Just curious,
44:02 you know, 'cause, I mean, when you were doing it back then, it was probably a different construct, or only so much you could do with the constraints. Yeah, definitely, definitely a
44:13 different beast. Certainly, right, it's exceeded.
44:18 If it hadn't, right, I'm not some sort of amazing visionary. No, for sure. But I mean, people have done science fiction forever and had this idea of autonomous robots and what these things can
44:29 do, but just - Yeah, yeah. No, definitely just the absolute scale of LLMs and kind of that cutting edge is kind of mind boggling. Just how huge they are. Now, back in 2010,
44:51 when I was doing my PhD work
44:53 there were certainly large neural networks and that type of stuff, but yeah, just the absolute scale of the amount of data that's being consumed, right? Literally, you just put it out on the internet
45:10 and say, have fun, consume,
45:14 and it's pretty amazing and pretty neat that we can accurately
45:21 consume all that stuff and build these models. No, it is. It's insane. Yeah, and so, you know, my research was looking at mixtures of experts, so that's kind of what you see now with some of the
45:35 newer LLMs. Yeah, combining approaches, so like Mixtral or Llama, you know. You've got one model on your data ingest doing something, and then another model on your
45:48 decisions. So, I'm gonna help you find some stuff, and then another model helping you make decisions based on that stuff it found. Well, cool, based on those decisions, I'm gonna spit out some output.
45:59 Yeah. Right, so those mixtures of experts, it's all the same type of patterns, just at different scales.
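A toy version of the mixture-of-experts idea: a gating function decides how much to trust each specialist for a given input, and the output is the weighted blend. The experts and numbers are purely illustrative.

# Gated mixture of two toy experts over a 3-feature input.
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def expert_linear(x):
    return float(x @ np.array([0.5, -0.2, 0.1]))

def expert_quadratic(x):
    return float((x ** 2) @ np.array([0.05, 0.1, -0.03]))

gate_weights = np.array([[1.0, 0.0, -0.5],    # gate scores for expert 1
                         [-0.3, 0.8, 0.2]])   # gate scores for expert 2

def mixture(x):
    gate = softmax(gate_weights @ x)          # how much to trust each expert
    outputs = np.array([expert_linear(x), expert_quadratic(x)])
    return float(gate @ outputs)

print(mixture(np.array([1.0, 2.0, 0.5])))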
46:10 So do you think, obviously you were working on that stuff 15 years ago, doing your PhD, I think even our first
46:16 guest ever, Talal, he's a good buddy of mine, but I mean, he was doing it back in the 90s at U of H, you know, some AI stuff. But I feel like some of this groundwork was
46:26 continuing to be laid, but they didn't have the compute power, the engine, you know, the horses to do it. And now we finally have it, and I think it's probably been able to
46:36 get there quickly by leveraging some of the concepts that have been developed over the years. But now it's like - Yeah, for sure. And just the access to the data, right? 15 years ago,
46:49 there was no, hey, can we go literally consume everything, yeah, everything on the internet. Well, no. I
46:59 mean, sure,
47:05 we were on ISDN or T1s, but there was no ubiquitous fiber at everybody's house.
47:15 The presence of the internet alone and how much data you have access to, I think, has probably been the biggest thing, but then certainly compute power, and yeah, being able to crunch through petaflops
47:27 of
47:29 model training is crazy. Yeah, it's really wild. So I guess getting into that too, even whether it's before the PhD or so, how did you kind of get into technology or
47:42 coding? It seems like you took probably a more traditional path. Yeah, more traditional path, right? Or even just as a kid, what was your first introduction, I guess? Yeah, so QBasic
47:55 on a 386 running DOS, you know, dating myself. I feel like one of those old guys, right? Back in the 90s, right? So teaching myself to code, it was cool, it was the new thing, and just kind of
48:11 started from there:
48:13 a traditional computer science interest in college, and then going into my master's and PhD in
48:26 machine learning, artificial intelligence. So that was really neat of, hey, like I said, it's just a fancy algorithm, but hey, we can do some fancy stuff here. Well, it seems like you were
48:42 even a little bit ahead of your time 'cause what was,
48:45 what, 2012 was the magazine article, data science is the sexiest job of the 21st century. Yeah, but yeah, and even then
48:54 data science was, well, data science is kind of, maybe, I mean, it's not going away, but it is being replaced by AI. Now, there's always going to be a place for human experts, but I mean, some
49:12 of those things that AI is good at, finding patterns, discovering, right? Especially like medical diagnoses, right? It can see one little blip and know that, hey, you should pay
49:26 attention to this, right? Cancer screening and stuff. Even that, you talk about mixture of experts, and we talked about it too, like the studies where they said, you know, say a
49:34 radiological oncologist is 75% effective and the AI is 88% effective, but then you put them together and now you have like a 95%.
49:44 Yeah, absolutely, and that even kind of goes back to simple, you know, Bayesian statistics, right? You can have any number of cancer screenings and they might be 90% accurate. But
49:58 even if they say yes, the chance that somebody has cancer might be really low just because the base rate of that cancer is low. But yeah, you start to mix these things: I've got one model
50:11 that's really great at detecting things very early, but not necessarily accurately. Sure. But then another model that's, cool, now I'm gonna take all of the data about that one person, funnel it
50:24 down and look at just hundreds of data points, and then match it with the radiologist, right? Yeah, absolutely. That's gonna be the next thing, or I mean it is, you see it now, combining those
50:40 different data sources, not having a one-size-fits-all type of solution.
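The quick arithmetic behind the Bayesian point above: even a "90% accurate" screen can leave a low post-test probability when the condition itself is rare. The prevalence and accuracy figures are illustrative, not clinical values.

# P(disease | positive test) via Bayes' rule.
def posterior(prevalence: float, sensitivity: float, specificity: float) -> float:
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 90%-sensitive, 90%-specific screen on a condition with 0.5% prevalence:
print(round(posterior(prevalence=0.005, sensitivity=0.9, specificity=0.9), 3))
# about 0.043, so a positive result still means under a 5% chance of disease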
50:46 Well, great. We always get through this super quick, and I could sit here and nerd out for a lot longer, but we usually finish up with kind of a speed round, where it could be about tech,
50:55 could be about what kind of food you like, but probably three, four, five questions, and then we'll be done here. Okay.
51:05 So, what's your favorite, we've been talking about AI, what's your favorite AI model that you've interacted with?
51:13 It's old school at this point: neural networks. That was my favorite one, learning,
51:19 eventually, maybe, right, we can get to where we have some way to kind of model the human brain. Yeah. That would be awesome. No, for sure. I think I used one of the
51:30 neural networks at one point, it was
51:34 an LSTM or something, it was over time series and it was able to kind of detect patterns, like, you know, passing over a time series, but I learned that, and then it was like, years
51:43 later, it's like, all right, now
51:46 everyone just kicks them to the curb. But yeah, so, you know, TensorFlow and those were the hotness there for a little while. But, um, so you're up in College Station, right?
51:56 Yep. All right. So if someone's going to College Station, where do they need to eat? Oh, goodness. Man, we've got a lot of good stuff up there. If you like barbecue,
52:04 C&J's is the local barbecue place, uh, luckily right there with Texas A&M. We've got a lot of good ethnic food if you want Indian, Mediterranean, Thai, so a lot of good stuff. Yeah, it's
52:21 like a mini version of Houston. Yeah, pretty much, pretty much. Um, all right. Back to tech. What's your favorite open source technology? Um,
52:31 I promise I'm not a fanboy. Yeah. .NET Core. Okay. Yeah. It's unlocked a lot for you. Yeah, when they made it
52:43 to where it could run natively on Linux and the Unix POSIX subsystems, they basically won at that point. And the fact that you can use it now to write a macOS app or a cloud app or a Docker app,
52:57 never would have thought that years ago, but it's crazy. I
53:02 remember even though, I think it was right around the time that they bought GitHub, but even before that, I think people didn't realize that Microsoft was the largest contributor to open source.
53:13 They really, when they made that shift, I don't know who made that decision. I have to imagine Satya had a big part in it. I mean, they've taken off under him, and he's got to have had that vision or
53:26 whatever it was. It definitely felt like it was after the Ballmer era, but they won when they made that decision. No, for sure.
53:37 Yeah, that's a good one, because we had a guy on from Red Hat early on in the show, and, I don't know that it's necessarily, it might be, leveraging .NET
53:46 Core, it probably has to be, but SQL Server actually runs faster on Red Hat than it does on Windows. Yeah. That's hilarious. Yeah. Probably with just less overhead underneath. Yeah,
53:57 at the operating system level, but
53:59 let's go a couple more. What's your favorite place to go on vacation? Oh goodness.
54:06 Love the beach. I love Texas, but not Galveston. Yeah, the Gulf of Mexico is very brown, in terms of just the dirt and stuff in the water. Great, nice and warm. Love the beach, white sands
54:20 somewhere. But I've been to Annecy, France twice. Okay. And so that's in the French Alps. Right, I think it's like 30, 60 kilometers away from Geneva, Switzerland, so kind of right there in
54:35 the corner where Switzerland, France and Italy all meet. Okay, nice, beautiful country. I'm not gonna be mad at that. Yeah. It's fantastic. Sweet, beautiful. And then I guess the last
54:49 one, what's your, well, I think you may have mentioned, what's your favorite database?
54:56 Timescale, as many times as it's pulled my bacon out of the fire. Yeah. In terms of performance, it's great. Postgres is awesome. Yeah. Back in the day, it was kind of a neat little fun toy
55:11 database, but it's a heavy hitter now. No, for sure. I mean, every time I turn around, there's some extension that makes people's lives easier. And the thing was, yeah, today I saw
55:20 something, they have something where it'll do native replication to
55:25 Iceberg or Parquet or one of those, like, on blob storage, you know. But again, I think it's that power of open source, but it's a legit database now, and there's an extension for
55:35 anything, like, on it. Literally anything you want to do. Was it PostgREST? It automatically creates a REST API off of your Postgres database. Oh, really? Yeah, that's cool. There's some extension like
55:46 that where, basically, it'll just take all the tables in your Postgres database and create a REST API from it. EF Core on crack? Yeah. So, no, it's absolutely a great choice. Well,
55:57 well, no, this was awesome. Thanks for making the trek down, especially on this rainy day, from College Station. Yeah, thanks for the invite. It was awesome. But really enjoyed
56:03 the conversation, and where can people find you? Yeah, so capsher.com. And then, you know,
56:11 I have a LinkedIn presence, but yeah, either, you know, go through the Capsher office and ask for me. Yeah, go to the Capsher office. Yeah, hit me up on
56:21 LinkedIn. I don't actually remember my handle. Yeah, fair enough. Josh, Joshua Johnston, but. After a little digging, yeah, you'll find him pretty easily, so. Awesome. We'll be good. For better
56:30 or for worse. Well, great. Well, hey, everyone, you know, if you liked this episode, make sure you like and subscribe. And until next time, thanks for listening and thanks for
56:38 being here. Yeah, of course, thank you for having me.
