551: AI Under Your Control

Released Monday, 26th February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

Well, boys, I've been chewing on something long enough now that I

0:02

think I'm ready to talk about it on the show. I've

0:04

been trying to think about all this AI

0:06

stuff that's been just thing

0:08

after thing. And

0:10

all of the discussion around safety and then the news that

0:12

we're going to talk about in a little bit with Gemini.

0:15

And I feel like perhaps I've stumbled on to a hard

0:17

truth. I want everybody's feedback, but

0:20

we're always talking about user consent and how much it

0:22

matters. But I don't think we're thinking

0:24

about it in terms that regular

0:27

people care about. I

0:29

think user consent matters and

0:31

hijacking your intentions as a user is

0:33

wrong. Telling me

0:35

something like this isn't appropriate is

0:37

hijacking my intentions. Enforcing

0:40

a particular worldview, I think that's

0:42

unethical. I mean, I agree on some

0:44

of the things that they're trying to protect, like racism. Those are bad. But

0:49

a for-profit

0:51

company like Google cramming

0:53

their ideals and morality down my throat just

0:56

feels like the worst case

0:58

sci-fi dystopian future that we

1:00

worried about. And we have

1:02

these corporations that are setting themselves up as

1:04

like arbiters of reality, of what's right and wrong. When

1:08

morality should be set by the people, not

1:11

corporations or governments. And

1:13

the more I think about this, I think this

1:15

is just going to be a hard reality because

1:17

all these companies are so risk averse. Yeah, they're

1:20

not really set up to be speaking truths. They're

1:22

set up to incentivize their profits and

1:24

say things that won't get them in trouble. Yeah,

1:28

they're not necessarily doing a good job of that, but that's what they're

1:30

trying to do. But like if I run it

1:32

on my system and I have control over it, it's on

1:34

my Linux box, those aren't my

1:36

priorities. And I think if this is a hard

1:39

truth that we kind of accept, then really the

1:41

future for interesting AI that's

1:43

going to be pushing the limits and for

1:45

good and bad. And we'll see stories around

1:47

both. I think it's going to be

1:49

open source. I think it's going to be

1:51

the open source stuff. And that was really the motivation for today's

1:53

episode. Hello

2:06

friends and welcome back to your weekly Linux

2:08

tech show. My name is Chris. My

2:11

name is Wes. And my name is Brent. Well

2:13

hello boys. Coming up on the show, you know what

2:15

they say, if you want it done right, you gotta

2:17

do it yourself. So this week, we're

2:20

gonna see if we can tame the moving target that is

2:22

the current AI stacks out there that you might want to

2:24

try. We think maybe we have

2:26

found one of the best ways possible to

2:28

deploy these AI tools under your own control

2:30

on your box. So stay tuned and find

2:32

out for that. And then we're gonna round the show out

2:34

with some great boosts and picks and a

2:36

lot more. So before we go further, let's say good

2:38

morning to our virtual lug. Time appropriate greetings, Mumble Room.

2:40

Hello Brent. Howdy.

2:43

Hello Brent. Hello.

2:45

Hello. Hello and

2:48

a big good morning to our friends

2:50

over at Tailscale: tailscale.com/linuxunplugged. We're

2:53

playing around with these tools. We're deploying it on our tail net. We can

2:55

all access it by name. We

2:58

don't have to worry about security vulnerabilities in these brand

3:00

new projects because we never put it on the public

3:02

internet. I don't really deploy anything on the public

3:04

internet anymore. It is the easiest way to connect

3:06

your devices and services to each other wherever they are regardless

3:09

of the platform. And it's fast. Go

3:12

try it on 100 devices

3:14

and support the show at

3:16

tailscale.com/linuxunplugged.

3:20

All right. So this week I think if we

3:22

could have one overarching goal for the show, I

3:24

would love to at least get the listeners thinking

3:27

more seriously about open

3:30

source AI tools and less about

3:32

things like ChatGPT and Copilots

3:34

and things like that. We're

3:37

gonna talk about the news that's come up this week in just a little bit.

3:40

So the tooling has come a long way for

3:42

the open source stuff just on the web and

3:44

communities, the stuff you can install, and

3:46

also some of it's gotten complicated and some of it's broken.

3:49

And I think the other reality we're sitting with is

3:52

these commercial solutions, they're so risk

3:55

averse that it's almost reducing the

3:57

usefulness of the

3:59

product, right? It's embarrassing really. It

4:01

worries me too because that's just more of the –

4:04

it's already an imbalance because only these giant companies seem

4:06

to have the resources to sort of train

4:08

the foundational models and of course they're

4:10

the ones applying all of these filters that end up

4:13

harming the actual productive use and so if you can't

4:15

say anything – I mean let's not

4:17

even say offensive but just sort of creative, weird, out

4:19

of the ordinary, then they're the only ones left with

4:21

the sort of access to these tools

4:23

where they can tune down those filters internally,

4:25

presumably, and then they can make any interesting

4:28

stuff. That's a good point. Yeah,

4:30

they're the only ones really. They become the gatekeepers

4:32

essentially of this and this is why

4:34

I sort of feel like we have a

4:36

limited time to win the race in the open source community.

4:39

I hope to make the case here in just

4:42

a moment that there are several factors that

4:44

are working not necessarily together

4:46

but the result is they work together to

4:50

limit the usefulness and

4:52

utility of the open source stuff and

4:54

I want to cover all of that in a moment. So

4:58

first we have a couple of events that are

5:00

just around the corner. We are three Sundays away

5:02

before we hit the road, boy. Yeah.

5:05

Okay. Three Sundays. I'm

5:07

sure you're totally ready, Brent, right? Because I mean – Oh yeah, yeah.

5:09

Yeah, always. You know me. I'm early on

5:11

the spot. I'm not worried about that at all. He

5:13

committed on here. Yep. Multiple times for weeks

5:15

in a row. And I always end

5:17

up there. Yeah, that's true. That's true. Just

5:20

how much did I stress about it is really it.

5:22

We are getting so excited about the very first NixCon

5:24

North America. We can't wait to bring you coverage

5:26

of that. That is co-located with

5:29

SCaLE in Pasadena, California. It is the

5:31

Southern California Linux Expo, number

5:33

21. It

5:35

all kicks off March 14th. We're hitting the

5:37

road just like a day or two before that. If you want to

5:40

sign up for SCaLE or go to NixCon, go to them both, get

5:43

a scale pass and use our promo code

5:45

JBFTW, JB for the win to save

5:47

50% off. And

5:50

then join us at lunch Saturday at 1:30 p.m. at the Yard

5:52

House. We want to see you there. And

5:54

then of course, go grab fountain.fm because

5:57

they are creating a totally upgraded live

5:59

experience for us, in app and

6:01

on the web. Ooh. They're working on a

6:03

cool web app that they've prototyped for some

6:05

of the live value for value music streaming

6:07

events. Okay. That can toggle between audio

6:09

and video. You can use it in the app or

6:12

you can use it on the web. Nice. Well,

6:15

I wanna see this. Oh, it's gonna be great. So

6:17

we're gonna have a whole batch of live streams, one on

6:19

the 12th, the 14th, the 15th, and the 17th, and you don't

6:21

have to really worry about any of that right

6:23

now, but we're looking forward to it. And I think, depending

6:26

on how all this goes, we're kinda gonna be defining a

6:28

new live experience for the show and

6:30

for future events. So I think it's gonna

6:32

be kind of trendsetting

6:34

for us. We're kicking off the first

6:37

event, bigly, bigly, as Wesley

6:39

says. That's right. Okay, well that's kinda

6:41

all I had, announcement-wise. You

6:43

feeling good? I am feeling good. Did you decide you're

6:45

flying back? I am. Yeah,

6:47

for the work, right? Mm-hmm. For your hashtag, J-Jog?

6:50

Yeah, get back to the regular things. Yeah, that's right.

6:52

We're gonna take Brent maybe somewhere special than just to

6:54

make you jealous, but. Dang it, I knew this would

6:56

happen. I'm gonna have to fly back to then visit

6:58

for that part of it. What are the options? Brent,

7:00

you know what I'm gonna be advocating for. I don't

7:03

know how we're gonna make it work. That's

7:05

what you said last time. I'm gonna be advocating for the coast.

7:08

Oh, this is exactly what you said last time.

7:10

I never have been. I know, I know, I

7:12

know, I know. I'm aware. But we haven't booked

7:14

any of our return Airbnb-bizzles, so. Bring

7:17

it on. Yeah, you've got flexibility. We do, maybe

7:19

we just do like hotels or something, I don't

7:21

know, but I would love to take

7:23

the coast. I think one of the prettiest

7:25

places, that west coast is one of the prettiest places in the

7:27

world. It's really something. We'll

7:30

take pictures though, Wes, we'll take pictures. All

7:34

right, so you probably heard the news this

7:36

week about Gemini just being embarrassingly bad. As

7:38

the register put it, Google

7:41

halts Gemini image generation to fix a

7:43

white balance. They

7:46

go on to say Google has

7:48

suspended the availability of text-to-image capabilities

7:50

in its recently released Gemini multimodal

7:52

foundational AI model after

7:54

it failed to accurately represent white

7:56

Europeans and Americans in specific historical

7:58

contexts. Another headline, which

8:00

I thought was really on point, quote,

8:02

big tech keeps poisoning the well without

8:04

facing any consequences for its folly. Well

8:08

I say today here on this show they

8:10

face the consequences gentlemen. Today we

8:12

say this far, no farther. The

8:15

line shall not be crossed. We

8:18

are going sovereign with our AI technology and

8:20

I was excited to see stable diffusion 3 came out. Big

8:23

fan of stable diffusion image generation stuff. I'm

8:27

not so sure though the hosted version is really

8:29

going to hold up. In

8:31

their announcement, they write,

8:34

we believe in safe responsible AI practices

8:36

this means we've taken and continue to

8:38

take responsible steps to prevent the misuse

8:40

of stable diffusion 3 by bad actors.

8:43

Who the bad actors are, though, I'm not sure; it doesn't say. I

8:45

guess they define who the bad actor

8:48

is. Oh yeah of course well that's definitely

8:50

one of them. They continue safety

8:52

starts when we begin training our model

8:54

and continues throughout the testing evaluation and

8:57

deployment. In preparation for

8:59

this early preview we've introduced numerous

9:01

safeguards by continually collaborating with researchers

9:03

experts and our community. We

9:06

expect to innovate further with integrity as

9:08

we approach this model's public release. So

9:11

all of the talk right now really is leading

9:13

with responsible AI practices

9:16

and safety. Perhaps as they should

9:18

be but it's definitely the focus and

9:20

I thought commenters on Hacker

9:23

News nailed this when

9:25

they were talking about how it's basically making

9:27

the products less useful. Their

9:29

sensitive nature is causing the rest of us to

9:31

just want to look for something that we can

9:33

self-host and run because I

9:36

get offended by them projecting some

9:38

intentions that they manufacture onto me. Yeah

9:41

well and I mean you just you're kind

9:43

of left out you know it hasn't

9:45

been a perfect system but by and large you

9:47

kind of get to decide you know

9:50

if you're gonna repost something if you're gonna share it

9:52

if you're gonna use your speech to have it and

9:55

you're out of that and you're sort

9:58

of implied like well you could do anything with this,

10:00

even though you might just want to make a silly limerick

10:03

right there in the app and you'll never use it again.

10:05

It kind of feels to me like this is potentially

10:07

going to create another divide that we've

10:10

seen in computer

10:12

privacy and data privacy. Those of us

10:14

who are technically sound enough to host

10:16

our own stuff will have far

10:19

more privacy than those

10:21

who just don't have those skills. This

10:24

feels like it's headed in that exact

10:26

direction. It's also alarming. The

10:28

point that the reg makes is actually well taken.

10:30

It's like they keep screwing it up, but

10:33

because we have no other choice or no other

10:35

viable choice, they face

10:38

no consequences for their folly.

10:41

You think about something like this that had to ship, how

10:44

they missed something like this in testing. We

10:46

were sharing screenshots this morning, I was, because throughout

10:49

the weekend, people are just trying various different things

10:51

that you think would be easy for it to

10:54

answer and it can't answer them.

10:56

It sure seems like it is kind of like end

10:58

of the pipeline. They're

11:01

not necessarily being able to fix or address the

11:03

systemic problems in their data sets. Then at the

11:05

end, they're kind of just like, oh,

11:08

well, we'll apply some little fix to try to

11:10

tweak it to make sure it's sufficiently diverse or

11:12

whatever the – like correct for

11:14

other things that they want to make it generally

11:16

more accurate, but then you're doing it so late

11:18

in the process that you get crap results. Wouldn't

11:20

it have to understand

11:23

to not answer? I think it depends on what

11:25

layer this is being applied. But

11:28

your point taken is it must realize the

11:30

truth in order to generate something that is

11:32

either the opposite of the truth or to

11:35

say it can't answer it because of moral reasons. It

11:38

must have some context awareness then of the

11:40

actual question underneath, but then you're right, the

11:42

presentation layer, they're restricting.

11:44

Well I just mean like what

11:47

is implied or learned from

11:49

the data sets that are being selected and

11:51

then they're just trying to apply things at the end that are

11:54

going to be very coarse grained and are not at the level

11:56

that a giant data set has. And

12:00

I think at the same time as

12:03

we have this sort of ridiculous over

12:05

safe, we had an example on Coda Radio. Somebody

12:08

wanted to know the absolute fastest way to

12:10

do something in C-sharp and

12:12

the LLM would not respond because it was

12:14

too risky and dangerous to do something just

12:16

for speed and not for good code quality.

12:19

Couldn't tell you the answer. That's where it feels

12:21

like it's another level. I think probably a lot

12:23

of us, especially for hosted services, I

12:26

expect there to be some level of this. If

12:28

I'm just generating obvious hate speech, I could see

12:31

it probably being reasonable for the

12:33

general public of something that's like, we're

12:35

not cool with this from our product for you. And

12:38

if it's an edge case, it's fine.

12:40

It's a reasonable limitation. But right

12:42

now, it feels like every other thing that you

12:44

ask these machines to do, you get back a

12:46

little slap in the face. Sometimes

12:49

it's even like – it's like a parody of

12:51

a brand name or something. It's like, oh,

12:53

we can't really let you speak ill of the Burger

12:56

King. Or

12:58

like public figures, which an

13:00

artist could whip up and it would be totally

13:02

free speech. It's an important part of our public

13:04

discourse. And then at

13:06

the same time, you have a group out there

13:09

that Politico has labeled as AI doomsayers

13:12

that are funded by billionaires that are

13:14

ramping up lobbying right now as we

13:16

record. So two

13:18

different nonprofits, the Center for AI Policy

13:20

and the Center for AI Safety funded

13:23

by different tech billionaires, have begun directly

13:25

lobbying Washington to address what they perceive

13:27

as existential threats posed by AI. So

13:31

each nonprofit has spent north of $100,000 on

13:33

lobbying efforts, both of them. Specifically,

13:38

they spent that money in

13:40

the last three months. And

13:42

they're drawing their funding from different organizations throughout

13:45

the industry that are tied to AI. And

13:49

the part that I love is like Brent's

13:51

good buddy, Sam Bankman-Fraud.

13:54

Both nonprofits have ties to the effective

13:56

altruism cause, which has

13:58

been an absolute – I

14:01

mean I just think it's been absolutely disgraced with

14:03

folks like Sam Altman and others. And

14:06

these folks all have what, in my opinion, is a

14:08

God complex where they believe they have to save humanity.

14:10

So they create problems and then they panic about them

14:12

and pretend like they're the only ones that can save

14:14

us from them. And that's

14:16

how they get off, is rich billionaires, is saving

14:18

the world from these existential crises that their very

14:21

funding created in the first place. The

14:24

efforts to influence Washington on AI were

14:26

conducted through going directly

14:28

at staffers and think tanks.

14:31

That was really the approach they took before. And we

14:33

got some executive orders

14:36

here and we got some action from a

14:39

couple of senators, but not much. Nothing really formed any

14:41

kind of law. But so now they're going with the

14:43

full-on lobbying effort. And essentially

14:45

what they're pushing for is in order

14:48

to work on AI projects, you need to be

14:51

licensed. And you need to

14:53

be essentially re-licensed every couple of years to make

14:55

sure you're not creating dangerous AI. And

14:58

then the projects you're working on need to be checked and

15:00

audited by a group, an industry

15:02

group. And they even want

15:05

to have restrictions around the power and size of

15:07

the models and whatnot is what

15:09

they're advocating for. And they

15:11

want this implemented at a law level

15:14

from Washington, D.C., here in the States. I'm

15:16

not against any kind of regulation for this kind of stuff.

15:18

I'm sure at some level, especially as the scale picks up,

15:20

we'll need some. But boy do I

15:22

– it does not seem like A, we're long enough

15:24

into it to really know what the effective regulation should

15:26

be. And then B, just the – I don't know

15:28

that our current system is really set up to –

15:32

it's all going to change a lot. So whatever

15:34

we need to do, the key would be to get

15:36

something reasonable and then something nimble that you would

15:38

actually update as these things develop, which I

15:40

have not a lot of faith in. And something

15:43

that didn't require millions of dollars of lawyers to

15:45

nursemaid through the system, which it would seem these

15:47

systems do require that, which almost means you're just

15:49

– Have we not locked in the incumbents already? Right. You're

15:52

just locking in the incumbents, right?

15:54

And you're making it particularly difficult to

15:57

use open source AI tools in

15:59

business, in a business, or in a firm for

16:01

development. You know, if you want to run a

16:03

little model on your little GPU, that's fine, but

16:06

don't go building something that's gonna be running as

16:08

part of a software-as-a-service. Sort

16:11

of the same thinking that reminds me of

16:13

the folks who get prosecuted for

16:15

hacking when they download a file,

16:17

yeah, like a public

16:19

school teacher. Yeah, yeah, it's information,

16:21

and I think that's why I

16:23

want to advocate to our listener base:

16:25

even if you're not currently wrapping

16:27

your heads around this, just

16:30

stay informed a little bit. We'll give you some

16:32

resources in an hour, and we're actively soliciting more from you

16:34

out there in the audience, because I feel like

16:36

we have kind of a race to win here, to a

16:38

degree, to get to a certain user

16:40

base size where they can't just completely

16:42

ignore it, at a legislative level.

16:45

Or if we get to a certain size of adoption

16:47

before it goes too far, then it's like the

16:49

genie's out of the bottle. And so

16:51

I feel like one of the things the show could do

16:54

to help contribute to that is cover the tooling that is

16:56

accessible to all of us and help

16:58

you deploy it in ways that are consistent,

17:01

that maybe are nimble, like Wes said,

17:03

because this thing is changing really fast. So

17:06

I think, you know, that's what

17:08

we focus on next.

17:12

warp.dev/linux-terminal.

17:15

And that's right,

17:17

Warp, the modern Rust-based

17:19

terminal with AI

17:21

built in, is now available for

17:23

Linux. It used to only be out there

17:25

for macOS users, but

17:27

it's available for Linux on multiple distributions,

17:29

including Ubuntu, Fedora, Debian,

17:31

and Arch. You can give

17:33

it a try today at warp.dev/linux-

17:35

terminal. It is something that feels

17:37

like they rethought the terminal for the

17:39

modern era. If you work in the

17:41

corporate world where there's Mac and Windows

17:43

and Linux systems, you're going to feel

17:45

like you have a tool that just

17:47

keeps up with the rest. This

17:50

is really slick because it's built on Rust,

17:52

so it's quick. But it has

17:54

a really nice modern text editor built

17:56

in, so when you're editing your

17:58

YAML of your Docker Compose, your

18:01

Jenkins or whatever it is, you know, you've got a modern

18:03

text editor built in. And it lets

18:05

You navigate through blocks of output really

18:07

quickly. Beyond that, man, there's that

18:10

warp ai that you can invoke and it

18:12

will suggest the right command for you. This

18:14

is what I've been playing with and it

18:16

is really handy. There's also just the ability

18:18

to customize the UI. Yeah,

18:20

you know, you get to do that. You can create your own

18:22

prompts too, and have all the nice

18:25

things set, ready to go, and recall those when

18:27

you need them. It's a great user experience. They have a

18:29

collaboration feature in there. All of

18:31

it is fantastic for developers

18:34

or engineers who

18:36

are at work or living in that terminal. I

18:39

sometimes looked over at some of the other terminals on the

18:41

commercial platforms and been a little jealous. And

18:43

now, not anymore. Go

18:46

Check it out!

18:49

warp.dev/linux-terminal. Now available for

18:51

Linux, including Ubuntu, Fedora,

18:53

Debian, and Arch Linux, with a

18:55

lot more coming soon. You know how those

18:57

things work out. Go check it out. Give

18:59

it a shot. Support the show by going

19:01

to warp.dev/linux-terminal. And

19:04

give it a try today:

19:06

warp.dev/linux-terminal. It's

19:12

not all bad news though. I mean,

19:15

at the same time that these

19:17

big corporations are going and sticking their

19:19

foot in it in public, they are also releasing

19:21

free and open-weight models

19:24

that you can go play with, like Google's

19:26

new Gemma. I mean, this is pretty nice

19:28

to see this soon after Gemini

19:30

comes out. It's a two billion

19:32

parameter model. Wow. Two sizes: the two billion,

19:35

seven billion parameters. Available now.

19:37

They do apply all their safety stuff

19:39

to it. So, yeah, you know, there's

19:41

different models you can grab. We talked about

19:43

this before: whatever institution makes them,

19:45

it comes with that institutional bias. There is

19:47

that aspect to it. But

19:49

why are they doing this? Why are they taking these

19:52

models that are kind of behind all of this work

19:54

they've done and releasing them to the

19:56

public? Is it like a

19:58

GitHub idea, where you make it free so

20:00

developers can get hooked on it?

20:02

I guess I don't really understand the

20:04

motivation for all of these models coming out,

20:07

especially when Google's trying to get people to pay

20:09

an outrageous amount for Gemini. Yeah,

20:11

I mean, well, the one end of it is

20:14

to help mindshare adoption, I think, because the

20:16

scene, the area of ML,

20:18

has been of academic interest for a long time and

20:20

then in the last few years has sort

20:22

of blossomed out to being brilliant applied science

20:24

in industry, and there is sort of a

20:28

reputational aspect of publishing what you're doing,

20:30

whether that be a paper, and then a

20:32

lot of times now the paper and the model behind

20:34

it. And maybe they just feel comfortable that there's

20:37

enough, you know, special sauce that they can sort

20:39

of keep on their own, plus all the resources, and

20:41

those sort of let them say, "oh, you and

20:43

I can get this to work and tie it into

20:46

stuff," but not the whole level of building it

20:48

into a nice service. And I hadn't thought about

20:50

the historical kind of momentum around releasing this stuff,

20:52

too, as sort of an educational background.

20:55

Speaking of that: Ollama, which we've talked

20:57

about on the show, has already incorporated it.

20:59

So Gemma comes out, and then within

21:01

a day or so, the

21:03

Ollama folks are already incorporating it into the

21:06

free stuff. It's moving really fast. Yeah, Ollama

21:08

seems to be playing nicely. I see

21:10

they have a REST API, and did

21:13

get, I think, ChatGPT-style

21:15

API compatibility. There's also some

21:17

bindings, like Python bindings, probably JavaScript bindings as

21:19

well, so it can get easier and easier to

21:22

adopt and build into your apps.
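[Since Ollama's REST API and ChatGPT-style compatibility come up here, a rough sketch of what a request to a locally hosted model can look like, using only the Python standard library. The port and endpoint path follow Ollama's documented OpenAI-compatible API, but treat the model tag (`gemma:2b`) and the other specifics as assumptions, not show-endorsed details.]

```python
import json
import urllib.request

# Ollama's default local port; the path mirrors OpenAI's chat-completions API.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "gemma:2b") -> urllib.request.Request:
    """Build a ChatGPT-style chat-completion request for a locally hosted model."""
    payload = {
        "model": model,  # the tag you pulled locally, e.g. `ollama pull gemma:2b`
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Write a limerick about self-hosting.")
# To actually send it (requires a running Ollama server on this machine):
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

[Nothing leaves the machine until `urlopen` is called, and because the request shape is OpenAI-compatible, pointing `OLLAMA_URL` at a different self-hosted server is the whole appeal of that compatibility layer.]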

21:24

So I thought we should

21:26

focus on image generation this episode. There's a

21:28

lot of different tooling out there, but all

21:31

of the hoopla around Gemini has been with

21:33

image generation. And we've talked

21:36

about Easy Stable Diffusion before. It's a

21:38

Docker container that you can get up

21:40

and running pretty quickly. This

21:42

week, though, I want to talk about InvokeAI.

21:44

InvokeAI is an implementation of

21:47

Stable Diffusion that has

21:49

along with it a great set

21:51

of tooling and a streamlined process

21:53

to generate images, pull in updated models

21:55

or modifiers, and work

21:57

to generate whatever images you

21:59

want, really, depending on the model you're using. We've been

22:01

experimenting with it on and off for the last few

22:03

days. We'll be talking about that. But the

22:06

reality is there's a

22:08

lot of ways to install something like this. And

22:11

I've noticed a lot of bad

22:13

habits. InvokeAI in

22:15

particular is guilty of this. A

22:18

lot of like just run this script. And

22:21

it just dumps stuff all over your

22:23

system and running stuff and I mean, I

22:25

think it's also some of, you know, just

22:27

the way these, the history, maybe some of

22:29

it coming out of academia. There's a lot

22:31

of specific stuff. There's different package managers. There's

22:33

a lot of specific dependencies that you need

22:35

for your particular hardware. Sometimes it's like sensitivity

22:37

to which version is going to work. You

22:39

know, new ones won't or way older ones

22:41

won't you need this set over here matched

22:44

with this one over here. Plus,

22:46

as you're saying, there's kind of a lot of

22:48

stateful stuff where you're running scripts to get everything

22:50

just so set up and then it's kind of

22:52

based on whoever happened to tie all those things

22:54

together. And then some of it's kind of brittle

22:56

because it's dependent on certain like video acceleration libraries

22:58

working and being there installed correctly. And you end

23:00

up with a bunch of stuff scattered plus then

23:02

there's the web UI is the goal along with

23:04

these things. So you got a separate sort of

23:06

JavaScript app probably that you got to you got

23:08

to handle and build and then have a web

23:10

server for seat and then because you know, you

23:12

can't just trust them that there's not like a

23:15

unified system for it. Everyone's gonna run their own

23:17

thing and you know, install Apache or add to

23:20

your nginx or – oh God, you're making my

23:22

blood pressure go up just talking about it is

23:24

really a mess. And then you also have like

23:26

Docker containers. Of course, those can be

23:28

loaded full of some stuff but not have other things

23:30

that you need for your particular hardware or like if

23:32

you deploy it on the Mac, you won't get access

23:34

to hardware acceleration in some cases. So

23:37

there's there's a lot of edge cases around

23:39

the containers and I haven't really seen any

23:41

of this stuff shipped as like

23:43

a Flatpak yet. I've been waiting for

23:45

something like InvokeAI and Stable Diffusion to just be a

23:48

Flatpak: you install it, it starts a little web

23:50

server gives you the URL and like a little pop up

23:52

and you click that and it brings

23:54

it up but so far that's

23:56

not happening. And so you have containers with

23:58

their limitations. Sometimes they'll work great, be exactly

24:00

what you need and sometimes you're

24:03

gonna be executing into that thing and fixing

24:05

stuff and adding stuff so it works on

24:07

your machine. You got the blast and spray

24:09

and pray method which is what

24:11

the Mac users seem to be just going for

24:13

which I don't understand you know

24:15

there's other ways you could do it and

24:17

then of course you could

24:20

do it from scratch and you could actually build it pull

24:22

down everything man is that a job. Yeah right you can

24:24

kind of follow the developer path and be like okay if

24:26

I was gonna be working on this project what all would

24:28

I have to do so you can guess we

24:31

didn't want to do it this way you could probably

24:33

guess from our tone this is not how we did it

24:36

and we think maybe you shouldn't do it those ways

24:38

either we think maybe we have a much

24:40

better way to do this. determinate.systems/unplugged.

24:47

Bring Nix to work the way you've

24:49

always wanted with flakehub.com. Go register

24:52

for the private beta and

24:54

get secure, compliance-friendly Nix

24:56

and all the support you

24:58

need at determinate.systems/unplugged. So let's

25:00

talk about flakehub.com. It's the all-in-one platform

25:02

for secure, compliant, transformative Nix

25:05

development and it is made by

25:08

our friends at Determinate Systems the

25:10

builders of the best Nix installer

25:12

including the massively popular MDM

25:14

and SCIF mode support, which gets

25:16

20,000 installs per day. Massively

25:18

successful. And they have an

25:21

advanced binary cache with sophisticated access controls. If

25:23

you're working in a team, it's going to

25:25

be mind-blowing for you. They're also the ones

25:27

behind that cloud-native Nix project we've talked about

25:29

and just a plethora of open source tools

25:31

and documentation for the Nix community and FlakeHub.

25:35

FlakeHub is solving big problems if

25:37

you're building with Nix. Nix caching

25:40

traditionally involves a globally flat namespace

25:42

so imagine building

25:44

software it's like having a massive toolbox with tons

25:46

of parts and tools in there but

25:49

finding the right one can take forever when it's a mess and

25:52

auditing everything that's in there that got pulled in

25:54

it's just, well, it's a nightmare, if maybe not

25:56

even impossible. Traditional

25:58

Nix caching is like having multiple toolboxes

26:00

scattered everywhere. And each team member

26:02

has their own set, and

26:04

it's hard to share the tools or know who has

26:06

which one, and if they have the right one, it's

26:10

not really elegant. Well,

26:12

FlakeHub is. It's a single organized toolbox

26:14

for your entire company, if you will.

26:17

Imagine one identity-aware cache where everyone gets

26:19

the right tools based on their permissions,

26:22

no more searching through that mess with fine-grained

26:24

access control, so you make sure sensitive things

26:27

stay in the right spot. And

26:30

FlakeHub has that globally distributed

26:32

build cache. Man, has

26:34

that been useful here when we've been

26:36

working with AI tools. If you can

26:38

use GitHub, FlakeHub will fit into your

26:40

existing DevOps workflow. And it's the only

26:42

SOC 2 certified platform for Nix, and

26:45

it delivers a low-friction Nix development

26:47

experience that you've always wanted. Go

26:50

bring Nix to work the way you've always

26:52

wanted with flakehub.com. You've got to

26:54

check this out. Go support the show and register

26:56

for the private beta to get secure, compliance-friendly Nix,

27:00

and all the support you need. You

27:02

go to determinate.systems/unplugged. We'll have

27:05

a link for that in the show

27:07

notes too. That's determinate.systems/unplugged. And

27:09

a big thank you to Determinate Systems for

27:11

sponsoring the Unplugged program. determinate.systems/unplugged.

27:19

Now, if you've been following the show lately, you

27:22

might realize that we have a favorite way of

27:25

solving this problem. We

27:33

wanted to find out if we could Nixify our AI tools, and

27:37

we didn't have to go very far. Sometimes

27:39

it's easy. Nixify.ai is a place you

27:42

can start. They are trying to just

27:44

make things simply available, a large repository

27:46

of AI executable code that

27:48

might be impractical or hard to build or figure

27:50

out yourself. They've got it. They've got

27:52

it over there. I think we saw it kind

27:54

of float by. Yeah. Some

27:57

of our Nix friends were sharing it around sometime last year,

27:59

but we actually. hadn't had a chance to give

28:01

it a try. And I mean, yeah, it's exactly

28:03

what we want, at least if it

28:05

works. Here's how they pitch it. Each project

28:08

is self-contained, but without containers.

28:11

By the way, when we say Nix, we don't mean NixOS.

28:14

It would work on NixOS. But this

28:16

will even work in WSL. You just need to

28:18

get Nix, the package manager, installed. So

28:20

you got self-contained applications. It's easy

28:23

to run. It's like

28:25

a one command to actually install and run it

28:27

once you've got Nix set up. Their

28:29

projects have support for NVIDIA and AMD

28:31

GPUs, so it'll work with either. And

28:33

like I mentioned, you WSL users

28:36

are not left out. And I recognize there's more

28:38

and more out there. Go get

28:40

NixOS-WSL, and then you

28:42

can run GPU accelerated AI workloads

28:44

on WSL using these commands. So

28:47

you can get it right there on your Windows box if you want. Now,

28:50

Wes, am I correct in

28:53

saying that they're using flakes under the hood? Yeah,

28:56

I think that is the primary method that they advertise. If

28:59

you go on the site and you want to go

29:01

get started, let's say

29:03

you've got an AMD system. You do

29:06

Nix run. And then basically a GitHub

29:08

flake reference that points you over at

29:10

their Nixified AI slash flake repo. And

29:13

then you tell it you want either the AMD or

29:15

the NVIDIA version, whichever project you're doing, like

29:18

we were doing in Invoke AI. I want to ask

29:20

you how you got it working in a moment. I

29:22

did the spray and pray approach on a demo system.

29:25

I'm not proud of it. And that's why I don't recommend it. But

29:27

a flake is perfect for this kind of thing. I

29:29

think you could think of a flake as it's actually

29:32

pretty simple. Think of it as like a building block

29:34

for a Nix project. Imagine a

29:36

flake is a self-contained box, and it has code

29:38

instructions for a thing. And

29:40

inside that there's a file named flake.nix. And

29:42

it tells Nix what to do. It'll say,

29:44

here are your inputs. These are the other

29:46

bricks you need to build the full wall.

29:49

And here are the outputs. This is what it should look

29:51

like. This is how things should be set up. These

29:53

are the parameters. And then you

29:56

can combine these to create bigger and

29:58

bigger systems from the inputs

30:00

you put in your config file. And

30:03

each flake has a lock file to ensure it's always

30:05

using the same exact version of everything. That

30:07

makes it reproducible and reliable. So that's really

30:09

big from an auditability standpoint, or maybe you're

30:12

deploying an appliance, or maybe you're

30:14

just trying to get something to work that worked on this

30:16

machine, to work just like it did on another machine. And

30:19

that's, I think, particularly where it's useful for these

30:21

types of tools. They've got a lot of brittle

30:23

parts and are moving quick. So

30:25

you could think of, like, regular Nix is, you're

30:28

building everything piece by piece with bricks, and

30:31

you're assembling those bricks, and you're building the

30:33

wall. With flakes, they're pre-made

30:35

chunks of the wall, with

30:37

clear inputs and outputs. They're like sets

30:39

of Legos that you can grab and set down

30:41

a whole set of Legos all at once.
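The brick analogy above can be made concrete. A minimal flake.nix might look like this; the description, input, and output here are illustrative, not taken from any flake discussed in the episode:

```nix
{
  # What this self-contained box is
  description = "Example flake";

  # Inputs: the other bricks you need to build the full wall
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
  };

  # Outputs: what the assembled result should look like
  outputs = { self, nixpkgs }: {
    packages.x86_64-linux.default =
      nixpkgs.legacyPackages.x86_64-linux.hello;
  };
}
```

Evaluating this flake generates a flake.lock alongside it, which is what pins each input to an exact revision.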

30:44

Yeah, you know, if you've ever done anything with,

30:46

you know, like PIP or NPM, or any of

30:48

these sort of programming language specific installers

30:51

and package managers, you know, they've got these lock files,

30:53

and they kind of let you manage everything and make

30:55

sure you do things in chunks that you can comprehend

30:57

and you've tested, and okay, I do want to update,

30:59

update them all for me, or, you know, leave them

31:02

pinned for now. Flakes and Nix

31:04

kind of lets you take that to all

31:06

of your packages. You know, it extends that approach

31:08

to the whole system, and

31:11

it lets you easily integrate with just

31:13

GitHub repos, whatever Git URLs you want

31:15

to use. And like you're saying,

31:17

this stuff's all moving so fast. You might have

31:19

custom versions of stuff, forked packages, different versions of

31:22

the web UI. You're not necessarily

31:24

going to want to wait for that to

31:26

like get into a centralized Nix packages repo.

31:28

Or any distro. Right. Yeah.
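That pinned, lock-file-driven workflow has explicit commands, much like a lock-file refresh in npm or pip. A sketch, with the caveat that flag spellings vary a bit across Nix versions:

```shell
# Refresh every input and rewrite flake.lock
nix flake update

# Refresh just one input (here nixpkgs), leaving the rest pinned
nix flake lock --update-input nixpkgs
```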

31:31

With the inputs, you know, the Nix flake

31:33

stuff goes and handles going and checking

31:35

what is the most recent commit, grabbing that, writing it

31:37

in your lock file, pinning it, and then getting you

31:39

all that stuff onto your system and then feeding it

31:42

into the rest of your Nix stuff. So

31:44

all that is just taken care of for

31:46

you. Now, late in the night, we

31:48

all tried our various methods, but Wes,

31:50

of course, of course did the Nixified

31:53

approach, and got it working.

31:56

Did you use a flake? Did you use a different installation approach?

31:58

How did you get things working on Nix? No, I

32:00

just went with the Nixified AI stuff. I

32:03

did see that Invoke AI has a

32:05

flake.nix in their repo, but

32:07

it's mostly just set up for getting the raw dependencies.

32:10

So it will handle all the CUDA stuff.

32:12

I think it built OpenCV, so it took a

32:14

little while, but thankfully that rig has plenty of

32:17

CPU. Shout out to our

32:19

audience that donated very nice servers. I think that

32:21

would be a nice setup if you were explicitly

32:23

trying to develop on it, because you'd kind of

32:25

get all the different Python libraries and the stuff

32:27

you needed and then you could just do the

32:29

build yourself right in that environment. But

32:31

if you want something that's a little more end user

32:33

focused and packaged, that's where the Nixified AI version came

32:35

in. I think they had some caching

32:38

in place, which was nice. So you didn't have

32:40

to build absolutely everything. Right. They

32:42

say you can do the nix run, but you might

32:44

want to do Nix shell, because

32:46

there's more than one binary for some of these.

32:49

So in particular for Invoke AI, you've

32:51

got just the Invoke AI command, which does like

32:53

a command line version, targeted at

32:55

more advanced users, maybe for like scripting or automation

32:58

or that kind of thing. But you

33:00

probably want the Invoke AI dash web command, because

33:02

you're trying to run the web UI. Yes, right.
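Concretely, the pattern being described might look like this. The flake attributes (invokeai-nvidia, invokeai-amd) follow the Nixified AI convention mentioned earlier and may have changed since, and the command names come from the discussion here:

```shell
# One-shot: run the default binary straight from the flake reference
nix run github:nixified-ai/flake#invokeai-nvidia   # or #invokeai-amd

# Or drop into a shell that puts all of the package's binaries on PATH,
# so you can choose between them
nix shell github:nixified-ai/flake#invokeai-nvidia
invokeai       # command-line version, for scripting and automation
invokeai-web   # the web UI most people want
```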

33:05

And I think by default, maybe there's one it

33:07

comes with, but there's a whole bunch of different

33:09

models this thing can use. So you're going to need to

33:12

do some configuration. And most importantly, you're going to need to

33:14

actually pull down all that stuff. You're going to want to

33:16

run some stuff before you get it all launched, where

33:19

they have a configuration command you can run. It's

33:21

even got a little sort of mouse enabled

33:23

command line interface that you can kind of

33:25

click through either with tab and spacebar or

33:28

double click right there. Let's

33:30

you select what model, some of the options, some of the

33:32

enhancements further on in the pipeline that you need. Yeah, it

33:34

makes it, it sounds like it's a lot of work, but

33:36

that part of it makes it really simple for anybody that's

33:38

even new to this. Yeah, I didn't super know what I

33:41

was doing. So I just kind of clicked some of the

33:43

ones. I was like, yeah, that sounds good. And for the

33:45

most part just worked. Does take a

33:47

little while, because there's gigs and gigs

33:49

of data to download, depending on how

33:51

many models you choose and which ones.

33:54

But it downloads it. You don't even need

33:56

to restart anything. The web UI will pick up the new

33:58

models. There's even a little button there to refresh.

34:00

They've gotten it so in

34:03

InvokeAI you've got a web UI that

34:05

lets you generate images, and then you can play

34:07

around with the different plumbing, the different models

34:09

behind it, and the different accelerators that they have,

34:12

to go tweak everything, right? And

34:14

you can start creating your own pictures. And you know

34:16

what stands out to me so far about Invoke is

34:18

you can tell it's kind of like

34:21

there's workflows that they have in mind, that there's

34:23

all kinds of, like, you know, setup here,

34:25

so you can see the queue as it's going

34:27

along, right? You can have it on

34:29

pause, or have it rapidly churn through some.

34:31

What was really neat is we could

34:33

share a central InvokeAI server, and we

34:35

could all have different images in the queue,

34:37

and I could see yours and you could see

34:39

mine, and they would just be generating in

34:41

the order they were submitted. Yeah, and you can

34:43

set up different workflows on here.

34:45

It may not be competitive with

34:47

everything, but you can see that you're getting a

34:50

little more sophistication than just, like, you know, here's

34:52

the input form, go run the model

34:54

for me. Yeah. So I'll tell

34:56

you how I obviously use this: if you

34:58

go to the Jupiter Broadcasting site, you can see how we

35:00

use this. For me, it's like

35:02

generating stock photography instead of paying for some

35:04

stock photography website. I just generate my own

35:06

now. But I've

35:09

also used this now

35:11

to generate backgrounds for my kids' devices,

35:13

so they have like a custom background from

35:15

dad. My wife has used it to

35:17

generate, like, her perfect wallpaper

35:19

for her phone. So she has, like, this photo

35:22

that she just loves to look at, that isn't anywhere

35:24

else; she kind of created it. And they're

35:26

great for that, and there's just ways you can use it

35:28

that are not necessarily what you might think of at

35:30

first, but the more you have these kinds of things installed,

35:32

it can be a nice little piece of

35:34

art that doesn't have to be all that important; I just need

35:36

something to add a little visual splash. It's just great for

35:39

that kind of stuff. And

35:41

they have a bunch of things you can try, of course, like

35:43

one I did not get a chance to play with. I

35:43

have it installed. It is

35:45

textgen, which, as you can probably guess,

35:47

is a web UI for large language models to spit

35:49

out different stuff. It supports a

35:51

bunch of the different models. And

35:53

I have that installed on my system, but I just didn't get

35:55

a chance to play with it. But again, using the Nixified

35:57

AI method, it's so straightforward.

36:00

Because it's not NixOS dependent, you can be on

36:03

Ubuntu and be using it, or you could be on macOS

36:05

or WSL yourself. I think that's

36:07

pretty powerful, because then you get a consistent experience across

36:09

all your different systems. And of course, you know, it

36:11

is modifying your system; in this case it added a

36:13

models directory to keep a

36:16

bunch of stuff and store its settings. But

36:18

you know, it's not... in

36:21

that case it didn't have root permissions, running as a regular

36:24

user, so you could make a whole

36:26

separate user to run it as if you'd like. And

36:28

then everything else, the stuff you build via Nix,

36:30

that's all just in the Nix store. And when

36:32

you go to collect garbage when you're done, you

36:34

didn't like that model, you wanted something else, you're

36:36

going to deploy it another way: you've cleaned up,

36:38

it's cleaned up. I think our opinion

36:40

is that it's easier to use,

36:43

you get reproducible builds, and

36:45

you have better organization of these really complex

36:47

projects that can have a

36:49

lot of pieces. I've

36:54

put a good question on the table: what's

36:56

reasonable hardware to expect this stuff to even

36:58

be usable on? Because I know, you know, some

37:00

of us are running it on fancy hardware,

37:02

but can anybody just run this on their

37:05

run-of-the-mill work laptop and give it at least

37:07

a try? I mean, obviously the performance is

37:09

gonna be different, but still. How

37:11

painful, Wes, would you say the image generation was

37:13

on the CPU? Decently, decently

37:16

painful, yeah. I mean, I'm only

37:18

waiting like four-ish minutes, and

37:20

for some of them it could be a lot quicker.

37:24

So it depends on how impatient you are.

37:27

If you have patience, you're fine. You know, and

37:29

it is nice, like, it gives you the

37:31

little... it can show you the

37:33

in-progress sort of pictures

37:35

as the model is refining and diffusing, and

37:37

in the command line, or in your logs,

37:39

it also has a little progress bar.

37:41

So you at least get some pretty decent feedback, especially compared

37:43

to some of the commercial systems, in terms of, like,

37:45

how long am I going to be waiting. But

37:48

if you're wondering where to start, I

37:51

think I'd say start with InvokeAI. It

37:53

is a fun one if you're listening to this and

37:55

thinking, I do want to try some of the local stuff,

37:57

I do want to try to help these projects grow, I

38:00

would like to understand this a little better. If

38:02

you get InvokeAI deployed, the

38:05

building blocks you use to do that will work for all

38:07

these other ones as well. There'll be little individual tweaks you

38:09

might make if it's depending on what the software is, but

38:11

like the method and the approach and the documentation is all

38:13

gonna be the same at that point. So

38:16

it's a journey, but once you

38:18

learn it, you'll have that path well-trodden. And

38:21

I think it's, I mean, I think it's like, we're

38:23

just gonna leave it, we're just gonna keep it. I mean, it's

38:25

gonna keep updating now. It's like, we're gonna be current with the

38:27

projects as they release stuff, and it's great for us. We just

38:29

set it and forget it. kolide.com/unplugged.

38:34

Well, you've probably heard us

38:36

talk about Kolide before, but

38:39

did you hear that Kolide was just acquired by

38:41

1Password? It's pretty big news since these

38:43

two companies are leading the industry in

38:45

creating security solutions that put users first.

38:48

For over a year, Kolide Device Trust has helped

38:50

companies with Okta ensure that

38:53

only known secure devices can access their

38:55

data. And well, it's still what

38:57

they're doing, but now they're doing it as part of 1Password. So

38:59

if you've got Okta and you've been meaning to

39:02

check out Kolide, now's a great time. Kolide

39:04

comes with a library of pre-built device posture checks,

39:06

and you can write your own custom checks for

39:08

just about anything you can think of. Plus

39:11

you can use Kolide on devices

39:13

without MDM, like your Linux fleet,

39:15

contractor devices, and every BYOD

39:17

phone and laptop in your company. Now

39:21

that Kolide is part of 1Password, they're

39:23

only getting better. So go check out

39:25

kolide.com/unplugged to learn more and watch

39:27

their demo. It's kolide.com/unplugged. Watch

39:30

the demo, support the

39:32

show. kolide.com/

39:35

unplugged. Well,

39:40

conference season is coming up quick, and we

39:42

have some updates specifically around Texas Linux Fest,

39:44

which is happening April 12th to 13th. Chris,

39:46

can you catch us up on those updates?

39:49

Yes, well, mainly the time slots and the schedule are now

39:51

up on the website. So if you've been trying to look

39:53

at what's going on, the talk titles and the speakers have

39:56

been filled in as we have them, and more will be

39:58

coming up there as they get confirmed. And

40:00

it sounds like there's at

40:03

least some planning in the works for a party

40:05

Friday night. Oh. Very

40:07

potentially and there may be others. They tend to be a little more

40:09

– planned at the last

40:11

second. And I think the good news

40:13

for anybody that's considering going is we do have a

40:15

hotel group rate link. So

40:18

I guess normally like there's like this

40:20

hotel is like 320-ish a night and

40:23

Texas Linux Fest group rate is going to be 250

40:26

a night. So it's not bad and you're right there.

40:28

You don't have to go far. So that's good. And

40:30

again that's April 12th and the 13th and then just

40:32

shortly after that, Linux Fest Northwest 2024

40:34

is the 26th through the 28th of April. And

40:39

the update here is the schedule is posted.

40:42

You'll see some familiar names on there. I

40:44

believe there's a Wes Payne on there. And

40:47

if you go to Wes's talk on Sunday at Linux Fest,

40:49

just stick around in that room because then we're going to

40:51

do Linux Unplugged after that right there. It's going to be

40:53

a whole day of fun. Yep. So you can just hang

40:55

out. Pretty great. Shoutout

40:58

to the core contributors. They are participating

41:00

in a membership program that finances the show

41:02

directly by the listeners and gives them access

41:04

to additional content. There's an ad-free version of

41:06

the show but I think

41:08

the real winner is the bootleg version

41:12

because the pre-shows have just been bangers. Some

41:14

spicy pre-shows recently. Some

41:16

spicy couple of pre-shows in there. Maybe it might

41:18

be worth checking out just for that. You do

41:20

get a little of the secret sauce. And

41:26

now it is time for the boost. Thank

41:29

you everybody who boosted in this week. We

41:31

got a whole big batch of Fountain FM

41:33

feedback and I – well, with Wes's help

41:36

bundled it all up and

41:38

sat down with Nick and we went over all of it.

41:40

And if you've been boosting from Fountain, I think nearly all

41:42

of them Nick has been replying to directly. So go check

41:44

your Fountain FM app. So

41:46

I'm not putting all of it in the show

41:49

because some of it is getting handled directly but

41:51

I will put some of it that I have answers for. And

41:53

Our baller boost came in from user 38 who sent in 51,000

41:55

sats. He

42:03

writes: my original donation was intended to be a

42:05

hundred and fifty thousand, but I had

42:07

some problems with, ah, Fountain, and maybe they

42:09

could fix the custom donation feature,

42:11

which was broken. Yeah.

42:14

Yeah, really, come on. The fact that he still

42:16

managed even with that bug,

42:18

we appreciate, and he has become our baller booster this

42:20

week. Sarcasm comes

42:22

in with forty-three thousand sats.

42:22

I thought that was what it would kind of

42:24

cost. I would like Fountain

42:27

to allow me to build playlists

42:29

from subscriptions, so it's not just

42:31

individual show episodes. I group my podcasts

42:34

in groups like tech news,

42:36

dad stuff, etc., and I

42:38

can't be bothered to peruse show episodes individually

42:40

themselves. I want to know what dad stuff he's

42:43

listening to. I don't have any dad stuff

42:45

podcasts. I'd love it if you'd boost in

42:47

and tell me what you're listening to, give me a recommendation.

42:52

So I wanted to put this one in the show

42:54

because I talked directly to Nick from Fountain about all

42:54

of these, and this one, he says, is doable

42:56

today: you can tag the episodes

43:00

and then you can build a playlist based on the

43:02

tags. So that is how you solve that problem, Sarcasm.

43:02

It is available to you already.

43:07

And I wanted you to know that. Thank

43:07

you for the boost. Ian clearly boosted in

43:09

a mega row of ducks,

43:12

for the Docker

43:14

Compose versus Ansible discussion: I ended

43:16

up on the Ansible, Docker Compose, plus

43:18

Vault side. I didn't understand secrets when

43:21

doing it with a Nix config, though. I

43:23

get that the config says what

43:25

file you need but doesn't store the

43:28

contents or secrets. Am I missing something? We've

43:32

done this; we've had different configs

43:32

where we refer to secrets

43:34

in different ways. What do we mean by secrets? Secrets in Nix

43:36

is a whole topic, so that depends on the

43:38

approach that you want to take. There's the simple one,

43:40

or maybe it's not simple, depends on what you

43:42

think. Okay, but there's the approach described here that

43:45

we've done, for example in our Nextcloud

43:47

setup, where you manage secrets separately, however you'd

43:49

like. You could use Vault, which we

43:51

did link to, so you may be

43:53

referring to that. And that just means

43:55

you need a separate process to get the secrets up where

43:57

you want them, and the Nix stuff

44:01

is just set up to refer to those and expects them

44:03

to exist. Or you can

44:05

deploy stuff in Nix. There's

44:07

a few different options. I mean, you could just embed them in there if

44:09

you weren't worried about that. But you do have to be careful if you're

44:12

worried about security or you're on multi-use machines

44:14

because depending on what method you use, you

44:16

probably don't want your plain-text secrets in the

44:18

Nix store because anyone can just go and

44:21

read that. But two popular

44:23

ones, there's Age Nix as

44:25

well as SOPsNix, S-O-P-S, and then there's

44:27

a bunch of other projects as well.

44:30

So depending on the approach, the complexity, what your actual

44:32

needs in terms of deployment are, there

44:34

are a bunch of ways to do it in Nix, but that's

44:37

not something that we built in or planned around

44:39

our solution. And of course, Ansible and Ansible Vault

44:41

is a time-honored way to do it. Yeah, I

44:43

mean, if it works? Sounds like a

44:45

nice setup. Thank you, Ian. Appreciate the boost.
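For the Nix-native route mentioned above (agenix, sops-nix), here is a minimal agenix-style sketch of the "keep plaintext out of the Nix store" idea. The secret name and services.myService option are hypothetical placeholders, not real module options:

```nix
{ config, ... }:
{
  # The encrypted .age file is safe to commit; agenix decrypts it at
  # activation time under /run/agenix/, outside the world-readable store.
  age.secrets.serviceToken.file = ./secrets/service-token.age;

  # Point the service at the decrypted path instead of embedding the value.
  # services.myService is a placeholder for whatever you are deploying.
  services.myService.tokenFile = config.age.secrets.serviceToken.path;
}
```

The key design point is that the Nix config only ever references a path; the plaintext never becomes part of a store derivation that any user could read.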

44:48

Sir Lurks-a-lot comes in with 2,048 sats. Hey,

44:51

sir, nice to see you. And I like hearing your

44:54

experience with VR headsets like the Quest Chris, but

44:56

I wonder how well they will work for people

44:58

with poor eyesight. Do you wear glasses

45:00

inside the headset or does it have optics to adjust

45:02

to focus for each eye? I

45:04

don't think it has individual eye focus, but

45:07

you could wear glasses. I know some people

45:09

do. For me, I'm, as I age, becoming

45:11

nearsighted. And so since the lenses are

45:13

just like right in front of your eyeballs, for me,

45:15

it's fantastic. I, in

45:17

Immersed, also realized that you can adjust some of

45:19

the encoding quality and things. So I didn't actually

45:22

have everything turned all the way up to like

45:24

its pristine quality. Looking even better

45:26

than it was for last week. I

45:29

used it for prep on this show. And one of the things

45:31

that I've also done now with my Immersed setup, which I can't

45:34

believe I didn't think of this sooner, is

45:36

I've replicated the monitors like I have

45:38

them here at the studio. So I

45:40

have a vertical screen here at the studio and

45:43

I now have that same on that same position. It's a

45:45

vertical screen and I have the same horizontal screens. And

45:48

so it's very much the same workflow for me in

45:51

Immersed or in the physical world. And I'm liking that

45:53

a lot. VT52 boosts in

45:55

with a roeduct. Uh, reusing nixos.org. search

46:00

for NixOS options, little

46:02

pro tip, try searching

46:04

over at mynixos.com. It

46:07

searches across options and packages

46:09

at the same time, so

46:12

you'll find e.g. Nix packages slash

46:14

packaged slash tmux and

46:17

Nix packages slash options slash

46:19

program slash tmux in one

46:21

search. That's cool. The UX

46:23

is a little nicer too. You

46:25

can browse among the options hierarchically instead of

46:27

one flat list. Bonus, it

46:29

also searches Nix packages and home manager.

46:32

Yeah, this is a great one. I kind of forgot about

46:34

it, but I have run into it a few times, you

46:37

know, just when trying to search for Nix options and not

46:39

already at the main search site. It's

46:41

great to have other options. Thank you, Vt. mynixos.com.

46:43

That is great. Thanks,

46:45

Vt. I really appreciate that. I

46:48

have a little pro tip to add to this

46:50

that I've implemented this week. So I've been searching

46:52

the Nix packages through

46:55

DuckDuckGo's bang implementation. So

46:57

you bang Nix packages and then you just search for what

47:00

you're looking for and it sends you directly to the

47:02

Nix package search. But I've

47:04

one upped that this week on

47:07

KDE Plasma, and I'm using the

47:09

plasma launcher, which has the same

47:11

functionality if you look deep enough

47:13

in the settings. And I'm going

47:16

searching directly from my desktop.

47:18

It's amazing. Yeah, nice. Yeah,

47:21

I need to use KRunner more. It is the

47:23

best runner up there. I feel like it's gotten a little pokey.

47:26

I'm hoping with plasma six, which comes out in a few days.

47:28

I'm hoping to fall in love with

47:30

KRunner all over again. Well, Gene Bean boosted

47:32

in with a row of ducks. Regarding

47:36

Fountain, it's missing the dynamic playlists

47:38

that are used in Casamatic and

47:40

Overcast or the show stopper for

47:42

me. For example, this

47:44

show is in a playlist called Linux

47:46

and Tech that contains the shows I

47:48

subscribe to that fit that particular topic.

47:51

The playlist automatically shows me the unplayed

47:53

episodes for these collections of shows. This

47:55

allows me to see what I've heard

47:57

and haven't heard yet in a given

47:59

topic. Tags, labels, and manual

48:01

lists just don't cut it for me. How

48:05

many people are doing this where you're

48:07

like you're play listing out the

48:09

topic, like Linux and tech for Gene being here? I

48:12

just always have and I'm curious about you guys. I

48:15

always just have like my main list of chronologically

48:18

released podcasts. And I just kind of scroll through that and

48:20

pick what I'm going to listen to and I don't ever

48:22

create a playlist or anything like that. I think

48:24

at most I might create like a per session,

48:26

like I'm going for a drive and I want

48:28

to queue up a couple, something like that. I

48:30

do have road trip playlists. What about

48:32

you Brent? Do you just stack your list or

48:34

do you have playlists? Do you go all organized?

48:37

I think I would love this because typically when

48:39

I'm reaching for my podcast players because I'm in

48:41

the mood to listen to a certain type of

48:43

content. And so I end

48:45

up, Chris, like you browsing through the list,

48:48

but I find that more to be friction than inspiration

48:50

most of the time for me. So I think I

48:52

would love this feature. Lazy locks comes in

48:54

with 10,000 sats. It's

48:57

over 9000. All

49:00

your talk of virtual displays has got me

49:02

wanting to check out links in the show

49:04

notes. That visor looks promising. I hope it

49:06

delivers. Looking forward to seeing you guys at

49:08

scale. Yeah, Lazy Lock. Heck yeah. So

49:11

visor.com, the deal I think is over

49:13

now, but it's like 400 bucks

49:15

for VR headsets that are just

49:18

designed for working. Really

49:20

high quality displays that look like sunglasses.

49:23

So you're not wearing big old ski goggles. I

49:25

think people like that. Master reboot comes in with

49:27

9,000 sats. Just

49:30

following up on a telegram message about the

49:32

comparison between Quest 2 and the Quest 3.

49:36

If I look dead center, everything seems fine

49:38

on my Quest 2. But if

49:40

I stray just a little, things begin to

49:42

look fuzzy. Can you tell me

49:44

if the Quest 3 is like this as well? This

49:48

is my first VR, so I don't really have anything to compare it to.

49:50

That's a great question because you

49:52

would think this wouldn't really be viable if

49:54

you looked at this through the Quest 2. I

49:57

tried the Quest 2. I saw what you're seeing. I tried it this

49:59

weekend with my daughters. It's one thing for a

50:01

game. It's another if you're working in

50:03

the terminal, composing an email, and reading text

50:05

for hours. Huge difference. Yeah. Yeah, with a

50:07

3D game you don't really care. But when you're trying

50:09

to read text on a monitor big

50:12

difference. So on the Quest 3, you

50:14

have a much, much better field of vision.

50:17

I noticed because I had,

50:19

I mean, I

50:21

zoomed my screens to

50:24

test this for you, at the maximum width of

50:26

my peripheral vision. And at

50:28

the edges of my peripheral vision the monitor

50:30

does kind of have like a little kind

50:32

of a blurry effect to it at the very, very edges. But

50:35

I'm talking like if you held your hands to the edge of where

50:37

you can see them, that's where it starts getting blurry. But

50:40

then the field pretty much from in there to the

50:42

center is looking really sharp. And if you use Immersed,

50:44

make sure you mess around with the encoding options, and

50:46

that also makes the text look better, I've

50:49

learned. So got it. You got it. The

50:51

defaults are optimized for Wi-Fi

50:53

performance and, you know, they assume your Wi-Fi is

50:55

crap. But if you've got good Wi-Fi,

50:57

or you hook it up over USB-C, you can step

50:59

up the quality and the text looks even better. Bear

51:02

454 boosted in 5,000 sats. Well,

51:06

hey there, Bear. Have tried Fountain a few times,

51:08

but just keep going back to Pocket Casts, and

51:10

there are a few reasons for that. Number

51:12

one: the home screen makes it feel a

51:15

lot more like a social media platform than

51:17

a podcast player. Let's take these one by

51:19

one. So the home screen making

51:21

it feel more like a social media app than a

51:23

podcast player: I agreed too, and it was the

51:25

biggest friction point I had with Fountain initially. Then

51:28

I discovered like six new podcasts I never

51:30

would have found ever and they're my regular

51:32

listens now and I've changed my tune a

51:34

little bit. Then I also

51:37

started following a couple of other JB listeners

51:39

and they started following me. That's

51:41

fun. And now I'm seeing their comments and

51:43

what they're listening to and they see my

51:45

clips and Now it

51:47

makes more sense. But Bear, I definitely had to

51:49

take I don't know man three

51:51

weeks of just being like I really

51:53

wish you would just open my library. I really wish and now

51:56

when I launch Fountain I

51:58

actually stay on the home screen because I know in

52:00

the library screen it's just downloading my podcast anyway.

52:02

So while it's downloading my podcast, I just scroll

52:04

through the home screen, see what other listeners have

52:06

been boosting or listening

52:09

to. So I know it's not

52:11

exactly what you want to hear, but you might just

52:13

give it a reframe of thought. Bear says number two,

52:15

I find it really hard to see new unplayed podcasts.

52:17

Even when I play an episode, it stays

52:20

on my episodes list. I

52:22

would appreciate an option just to see new

52:25

unplayed episodes. Yeah, or auto delete maybe when

52:27

you finish an episode. So

52:29

they've added swipe-to-delete, which makes that a little easier.

52:31

But yeah, I think right now by default, played episodes

52:33

are still chronologically listed. They just are now

52:35

marked as played. And

52:38

Bear's number three, my biggest gripe is I

52:40

really would like an option to make

52:42

good use of bigger text. The text in

52:44

the app is just small for me. Right.

52:49

I mean, I wonder, I know on iOS, you

52:51

can bump the zoom up. I

52:53

don't do that on my Pixel, but maybe there's

52:56

an accessibility option there. I'll chat with them about that

52:58

in particular, Bear. I know you had some Alby issues too.

53:00

If you hit me up on matrix, I can

53:02

help you troubleshoot that. That's

53:04

good feedback though. And those are things we will be talking

53:06

about in our next sit-down next Thursday. Pressly

53:09

Weep HD, I'm

53:12

going with it, comes in with 2,669 sats from

53:14

Fountain. And he says, I

53:20

got a math problem for you guys. It's

53:22

the sats times

53:24

17 equals my zip code.
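
The puzzle is plain arithmetic, checkable in one line:

```python
# The boost amount times 17 should give the booster's ZIP code.
sats = 2669
zip_code = sats * 17
print(zip_code)  # → 45373
```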

53:26

So he's doing a little math game with us. Did

53:28

you bring that map? Oh, you did. This is why

53:30

we're weeping in HD. It makes sense now. Yeah, 2669

53:32

times 17. That would be 45373. And

53:34

that seems to

53:43

be a postal code in Miami County, Ohio.

53:46

Oh, cities

53:48

like Troy, Alcony, or Staunton.

53:51

So they got a county called Miami. And

53:54

they got a city called Troy. Great.

53:56

This is so great. I mean, you just like where do

53:58

you live? I live in Troy. in

54:01

Miami. You know, Ohio. Thanks,

54:04

Weep. Appreciate

54:07

that. Boosts! Difficulty

54:10

Adjustment boosts in with ten thousand sats. It's over nine

54:12

thousand! Love your

54:14

work, gentlemen. Um, a

54:17

video option on Fountain would be a game changer.

54:19

A little birdie tells me that may be a possibility.

54:21

I think a lot of the plumbing is there,

54:23

and a lot of the underlying libraries support it.

54:25

We may get there with our live item tag (LIT) support for our SCALE trip.

54:27

We shall see. So

54:30

that'd be taking advantage of alternative enclosure set-ups? Yeah, you

54:32

got it. So the Podcasting 2.0 spec has the

54:34

alternate enclosure, and in a live item, too, you can do: here's

54:38

my primary stream, and here's an alternative stream. And you basically

54:40

put like an HLS or RTMP URL in there. And

54:43

then the client player would just grab that and

54:45

figure it out. So it could happen, Difficulty Adjustment.
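
What's being described is the Podcasting 2.0 `alternateEnclosure` tag: an item declares extra streams alongside its regular enclosure, and the client picks one. A rough sketch of how a client might read those URIs with stdlib Python — the feed snippet is invented, and the authoritative attribute list lives in the podcast namespace spec, so treat this as illustrative:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up feed item: a primary MP3 enclosure plus an
# alternate HLS stream declared via podcast:alternateEnclosure.
FEED_ITEM = """\
<item xmlns:podcast="https://podcastindex.org/namespace/1.0">
  <enclosure url="https://example.com/ep551.mp3" type="audio/mpeg" length="12345678"/>
  <podcast:alternateEnclosure type="application/x-mpegURL">
    <podcast:source uri="https://example.com/ep551/live.m3u8"/>
  </podcast:alternateEnclosure>
</item>
"""

NS = {"podcast": "https://podcastindex.org/namespace/1.0"}

def alternate_sources(item_xml):
    """Collect (mime-type, uri) pairs from any alternateEnclosure tags."""
    item = ET.fromstring(item_xml)
    return [
        (alt.get("type"), src.get("uri"))
        for alt in item.findall("podcast:alternateEnclosure", NS)
        for src in alt.findall("podcast:source", NS)
    ]

print(alternate_sources(FEED_ITEM))
```

A player that understands the tag can fall back from the alternate stream to the primary enclosure, which is exactly the "the client would just grab that and figure it out" behavior described above.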

54:48

I like that username too. Barnminer comes

54:50

in with five thousand, five hundred and fifty-five sats. He says, tell

54:52

Fountain I switched to Breez Wallet for boosts since

54:54

it's self-custody. I actually did. And

54:58

Nick's response was yes, that is on the list. It's

55:00

been on the list for a while. It's still on the list. Well

55:04

network Rob sent in a Spaceballs boost.

55:06

So the combination is one, two, three, four, five.

55:10

That's the stupidest combination I've ever heard

55:12

in my life. So

55:14

I'm boosting from Fountain today for a couple reasons. One, giving a

55:16

little extra value back to my stream sats. And

55:20

also number two, SCALE. I

55:22

know, I'm not a fan of that word.

55:24

I know it's a little, but

55:27

I know every little bit helps. Number three,

55:29

keep up the great work. Love the value

55:31

to value. And number four, well since I'm behind

55:33

on episodes, giving a small hello to future Rob

55:36

when I get caught up. Hey,

55:39

thanks for joining me for the next episode of the

55:41

Hated in Pasadena. Thank

55:43

you very much network Rob. Appreciate

55:45

that catch up boost. We look

55:47

forward to hearing from you when you do get caught up. So

55:50

excited about SCALE and NixCon. So, so

55:52

glad we're going. I feel like we don't

55:54

give enough attention to the other events that are going on, but this

55:56

is the big one that kicks off the year, right? And we only

55:58

have so much attention to pay, in our

56:00

tiny little brains. That is probably really what the

56:02

problem is. Yeah. Ben the tech

56:04

guy comes in with a row of ducks. It

56:06

says, Steam Link is already

56:09

in the Meta store. I mentioned I sideloaded it.

56:11

And it does support proper VR, but

56:13

only on Windows and macOS. Fun,

56:16

uh, no, no, no. They're actively working

56:18

on Linux support. But in the meantime,

56:20

ALVR works pretty well. Okay. I

56:22

didn't even look for it in the meta store. I just

56:24

got, so, in the workflow of sideloading APKs, I

56:26

was like, I'll just go get it. I don't

56:28

need no store. I don't need no store. This

56:31

is a sign of a former iOS user, folks. What?

56:34

No, don't you think I would have defaulted to the store? I

56:36

mean, you're so excited about the ability to side load at all.

56:38

Yeah, you might be onto something there. Mr.

56:41

Pibb that boosts in with 12,345 sats. Yes,

56:46

that's amazing. I've got the same combination on my

56:48

luggage. My issue with fountain is, I

56:51

use an auxiliary cable to listen in my car, and

56:54

the sound quality always gets

56:56

distorted at the necessary volume.

56:58

I use the iPhone app and I'm always

57:01

fiddling with phone volume and car stereo volume

57:03

trying to get it loud enough without distortion.

57:06

I have the same problem on Castamatic. I

57:08

enjoy the JB shows for the professional sounding

57:10

audio quality and it's great with my Bluetooth

57:12

headphones or a speaker, but just not with

57:14

the auxiliary cable in my car. But

57:18

if I use Spotify or Apple through the AUX,

57:21

it sounds just fine. Now what could

57:23

be causing that? Distortion when you're using

57:25

certain apps, but not

57:27

when you're using Spotify or the Apple apps.

57:29

Are these other apps normalizing, boosting the audio

57:31

to a level where there is some distortion

57:33

at those higher levels, but you just don't

57:36

have to get to the higher levels with

57:38

these other apps? So it's something like

57:40

a loudness-normalizing sort of thing. And it

57:42

only happens when you're using an AUX cable.
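
The theory above — an app pushing the audio toward a loudness target and clipping in the process — can be illustrated with a toy RMS normalizer (a sketch of the general mechanism, not what Spotify, Apple, or any of these podcast apps actually implement):

```python
import math

def rms(samples):
    """Root-mean-square level of a block of samples (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def normalize(samples, target_rms):
    """Scale samples toward target_rms, hard-clipping past full scale.

    The hard clip is where the audible distortion comes from: any peak
    the gain pushes beyond +/-1.0 gets flattened, which adds harmonics.
    """
    gain = target_rms / rms(samples)
    return [max(-1.0, min(1.0, s * gain)) for s in samples]

# A quiet signal pushed toward a loud target clips at full scale.
print(normalize([0.5, -0.5], 2.0))
```

If one app applies this kind of gain and another does not, the louder app distorts at the same stereo volume where the quieter app stays clean, which matches the AUX-cable symptom described above.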

57:45

That's interesting. Yeah, if anybody has any

57:47

ideas on this one, let us know.

57:49

Also, remember we're collecting your boost ideas

57:51

for AI resources and tools that we should

57:53

be looking at for the show. So please

57:55

boost those in. Thank you, everybody who did boost in.

57:58

If we didn't read your Fountain FM feedback, don't

58:00

worry, Nick and I covered it, and we went through all

58:02

of them, so check your profile, he's probably replied to you directly.

58:05

We had twenty-four boosters come in, and we

58:07

stacked two hundred and twenty-two thousand, two

58:09

hundred and nineteen sats. Almost a row of ducks,

58:11

a big old row of mighty ducks. So

58:13

I'm going to give it a thank you

58:15

kindly, and I want to call out a big old

58:17

make-it-rain for everybody who boosts

58:19

in. It's a great time to get a

58:21

new podcast app, because we are working

58:23

on new features. We've already

58:25

rolled them out to our members, and they're loving it.

58:28

And they're going to be coming out to the

58:30

main feed very soon, and some of them are

58:32

gonna start as we head off to SCALE, and

58:34

we'd like you to be there with us, and

58:37

keep podcasting weird. Get a decentralized podcast app at podcastapps.com. Podverse

58:39

and Fountain FM and Castamatic are really the favorites

58:41

of the show. You pick the one

58:43

that works best for you, and boost in with

58:45

a message and you can support us directly that way.

58:47

No middleman, all using an open source peer to peer

58:49

network. And thank you to everybody who streams sats

58:51

to us as well, we appreciate you.

58:53

And a big shoutout to our members out there for

58:56

using their fiat fun coupons to keep

58:58

us going on the regs every single

59:00

month. We really appreciate them too. You

59:04

mentioned that a lot of these large language

59:06

models and whatnot are coming out with a

59:08

paper. And, oh yeah, I

59:10

think one of the best ways to just

59:12

wrap your head around

59:15

what models have come out recently, and what problem they're

59:17

trying to solve, and if you even need to pay

59:19

attention to any of that, is probably

59:22

the daily papers over on

59:24

huggingface.co. And every

59:26

day, or I'm

59:28

pretty sure it's every weekday, they're

59:30

posting just one white paper after another, and

59:32

it's, I don't know, it's a fantastic

59:34

resource for just kind of watching the development

59:37

of these. You must have taken a look

59:39

at this earlier. It's a real, I mean,

59:41

huggingface.co itself is a resource. I'm

59:43

gonna assume most people listening to

59:45

this already know about

59:47

it, but as a higher-level

59:49

recommendation, huggingface.co, if you're

59:51

into any of this kind of stuff, for the community

59:54

around these open models and datasets, and yeah, they have

59:56

Spaces, plus a bunch of stuff like that, and then they

59:58

have these daily papers. And that's

1:00:00

at huggingface.co/papers. And

1:00:03

these are the white papers that accompany the announcement

1:00:05

of these large language models or

1:00:07

whatever they're working on. And I

1:00:09

think just you can look at

1:00:11

the headline, you can look at the summary and know if it's something

1:00:13

that interests you or not and pretty quickly stay on top of all

1:00:15

this stuff. I love resources like

1:00:17

this. If you have any out there, let us know. Yeah. And

1:00:20

you maybe take a look and see just how fast it goes

1:00:22

from paper to something you can actually use. Yeah.

1:00:26

Yeah. I would say if you quit

1:00:28

Reddit recently, you can go here. Even if you don't

1:00:30

want to dive into the papers, just looking at the

1:00:32

names of these projects is such a treat. So just

1:00:34

scroll through the names. I'm enjoying

1:00:36

that. Yeah. I feel like

1:00:38

this is our moment in history as a community to

1:00:41

keep this stuff open and

1:00:43

to keep the open-source stuff viable, to keep

1:00:45

the community vibrant. You

1:00:47

guys probably saw that on the back of

1:00:49

all this Gemini news, Google also announced though

1:00:51

a massive, was it like $60 billion

1:00:53

deal or something like that with Reddit to

1:00:56

pull in all the Reddit data to train AI? Mm-hmm.

1:01:00

That cost and

1:01:03

those kinds of deals, those

1:01:05

are going to always be relegated

1:01:07

to the absolute top, right? The big tech companies

1:01:09

that can afford to write $60 bill or $30

1:01:11

bill or $10 bill or whatever it

1:01:13

is, $30 bill. Yeah. I mean,

1:01:15

what, JB tried to buy the Reddit data, but they weren't interested. Yeah. It

1:01:18

turns out they wanted more than a row of ducks. But

1:01:22

like you look at things like Hugging Face, you

1:01:24

look at InvokeAI, you put it all

1:01:26

together and we saw a very, very

1:01:28

viable shot here. And I think ultimately if I

1:01:30

could be a dreamer long-term,

1:01:34

I'd love to hear stories of listeners

1:01:36

that are implementing this stuff in small

1:01:38

businesses or in their workplace, maybe even

1:01:40

large businesses, because that's where

1:01:42

I fear the most that we're going to get locked

1:01:45

into corporate AI. And it's going to be this risk

1:01:48

averse, not very useful,

1:01:51

overly talkative, watered down

1:01:53

crap tool. And

1:01:55

the reality is, and the BSD folks

1:01:57

have always been right about this, is... You

1:02:00

know, a powerful tool you can shoot yourself in the

1:02:02

foot with. And you can do good

1:02:04

and bad with rm -rf. And

1:02:07

I feel like it's the same kind of general principle with

1:02:09

this stuff. So if you're out there in the field, you're

1:02:11

deploying this somewhere where you've got more than a couple of

1:02:13

users using this stuff, boost or

1:02:15

write in and tell us how that's going. Yeah,

1:02:17

are you relying on AI-powered stuff for your

1:02:20

business? Would you? Do you feel differently about

1:02:22

it if it's an open-source thing versus something

1:02:24

you can only interface with via API? Yeah,

1:02:27

I sure do. It's

1:02:30

just a wild new frontier. And

1:02:32

we're really just kind of at the beginning. I thought we'd

1:02:34

see how NVIDIA went, and depending on how NVIDIA went, we

1:02:37

knew how much momentum there was in the industry. And NVIDIA

1:02:39

popped to the high end for sure. I think

1:02:41

there's still a lot of legs left in this. But

1:02:44

where it actually ends up ultimately and what kind of

1:02:46

end result we get, that is

1:02:48

not so clear at this point. And we, I

1:02:51

believe, still have a chance to shape that future.

1:02:53

See you next week. Same bat time, same bat

1:02:55

station. We'd

1:02:57

love to have you join us live. If you can get in

1:02:59

that mumble room, which is popping this week. Lots of folks in

1:03:01

there. We always hang out with them, and

1:03:04

they can pop in during the show or before and after the show

1:03:06

is really when we just get to chatting. It's

1:03:08

great to hang out, gives us that live vibe. We've got

1:03:10

details on our website for that. And we do the show

1:03:12

on Sundays at noon Pacific, 3 p.m. Eastern. Links

1:03:15

to what we talked about today,

1:03:17

linuxunplugged.com/551 for the tools and the links

1:03:19

and the resources, all that kind of stuff.

1:03:22

If you thought this show was useful or maybe

1:03:24

somebody else should hear it, please share it. That

1:03:27

is the number one way podcasts get marketed. It's

1:03:29

not really through anything else. It's not going to

1:03:31

be Google Ads, I'll tell you that. So, appreciate

1:03:33

that. Thanks so much for joining us

1:03:35

this week. See you next Sunday.
