Querying OpenStreetMaps via API & Lazy Evaluation in Python

Released Friday, 10th May 2024

Episode Transcript
0:00

Welcome to the Real Python Podcast.

0:02

This is episode 204.

0:05

Would you like to get more practice working

0:07

with APIs in Python? How

0:09

about exploring the globe using the

0:12

data from OpenStreetMap? Christopher Trudeau

0:14

is back on the show this week,

0:16

bringing another batch of PyCoder's Weekly articles

0:18

and projects. We share an

0:21

article from the PyBites blog

0:23

about building queries using the Overpass

0:25

API for OpenStreetMap. The

0:28

post explores the data structures, tags,

0:30

query formats, and how to use

0:32

Overpass in Python. Christopher

0:34

discussed a Real Python article by

0:37

recent guest Stephen Gruppetta about

0:39

lazy evaluation in Python. The

0:41

piece covers the advantages of

0:43

generator expressions and functions and

0:46

the potential disadvantages of using

0:48

lazy vs. eager evaluation methods.

0:50

We also share several other

0:52

articles and projects for the

0:54

Python community, including a news

0:56

roundup, handling Ctrl+C in

0:59

asyncio, preventing data leakage

1:01

in pandas and scikit-

1:03

learn, discussing the Django developer

1:05

survey results, asking developers why

1:07

they aren't shipping faster, using

1:10

uv to install into isolated

1:12

environments and a couple of tools

1:14

for retrying in Python. This

1:17

episode is brought to you by

1:19

Sentry. Fix application issues before

1:21

they become downtime with Sentry

1:24

error and performance monitoring for

1:26

Python. All right, let's

1:28

get started. The

1:48

Real Python Podcast is a weekly conversation

1:51

about using Python in the real world.

1:54

My name is Christopher Bailey, your host. Each

1:56

week we feature interviews with experts

1:58

in the community and discussions about

2:00

the topics, articles, and courses found

2:02

at realpython.com. After the podcast,

2:04

join us and learn real-world

2:06

Python skills with a community of

2:08

experts at realpython.com. Hey Christopher, welcome

2:10

back. Hey there. So

2:13

we have a full slate this week.

2:15

We got a few news items, a

2:18

set of articles to cover, and then a couple

2:20

discussion things that we're kind of squishing together, which

2:22

I think will be fun to discuss in

2:25

relation to what's happening in the world of Python.

2:27

And then of course, projects. Who doesn't like

2:30

squishy discussions? Yeah, exactly. All

2:33

right, so you're ready to start it off with

2:35

the news? Sure. First bit

2:38

is PyPy, that's P-Y-P-Y, the

2:40

alternate Python interpreter, has

2:42

released version 7.3.16. This

2:46

one includes Python compatibility for 2.7,

2:49

yes, they still do that, 3.9, and 3.10. So

2:52

if you're into PyPy, go check that out. The

2:55

next piece is PEP 745. This

2:58

is an informational PEP announcing the

3:00

schedule for Python 3.14. Yes,

3:03

that's right. They're already thinking about 3.14. Yeah.

3:07

I doubt I'll be the first to point this out, but that's

3:10

3.14, that's pi, the number, which

3:13

is just gonna be so fun, because

3:15

we'll have PyPy or PyPI to get

3:17

the pi version of Python. So anyways,

3:20

I might be the first to point out the

3:22

obvious, but I'm pretty sure between now and Wednesday,

3:24

October 1st, 2025, which

3:27

is when it comes out, I won't be

3:29

the last. And

3:32

then finally, one little bit of

3:34

not Python, but programming history news,

3:36

the BASIC programming language just turned

3:38

60. BASIC was

3:40

my first programming language. You're

3:43

not gonna catch me waxing nostalgic about

3:45

a lack of functions, required line numbers,

3:47

and a need for GOTO, because yeah,

3:49

I started BASIC before subroutines. I'm

3:52

happy to see that Python's a lot of

3:54

people's first language now, not just because I

3:56

like Python, but because it isn't a toy,

3:59

And that's kind of what BASIC was considered way back in

4:01

the day. How about you? Is that where you

4:03

started as well? Yes, I was

4:05

one of those people that coded,

4:08

from the back of a magazine, a

4:10

little text adventure in. A friend had

4:12

an Apple II, and then I eventually

4:14

got my Adam computer, which is

4:16

an Apple II, and, sorry, I got to

4:18

play inside BASIC quite a bit and did lots

4:20

of GOTOs. And

4:23

loops, and dancing around in choose-your-

4:25

own-adventure styles. Yeah.

4:27

I remember both Donkey — on, you know,

4:29

the PC Junior — which was: hit the

4:31

space bar and move the donkey from

4:33

the left lane to the right lane

4:35

so it doesn't get hit by a car —

4:37

that code you can actually read and

4:39

understand. And I remember magazine articles for

4:42

the Commodore 64, which essentially were

4:44

just sprite-based PEEK and

4:46

POKE, which is like setting settings and

4:48

removing settings. It was just gibberish, like,

4:50

you typed it in, and you're like, I don't

4:52

know what it's supposed to do. Oh, is

4:55

it doing the right thing? I have no

4:57

idea. I learned no coding from PEEK and

4:59

POKE, but yeah, you could make pretty sounds

5:01

and make pretty pictures. So the most recent

5:04

time I've heard about that was they were

5:06

fixing Voyager, and it had a

5:08

bad chip — and I feel for that whole

5:10

story. But they were doing something

5:13

similar, moving all that information over and PEEKing

5:16

and POKEing to see if they could make

5:18

it work across — what, how many billion

5:20

miles away? Pretty

5:23

easy. For your first

5:25

topic this week, I've got an

5:27

article by Jason Brownlee, who writes

5:30

for Super Fast Python, which focuses on

5:32

concurrent programming and optimization of your

5:34

favorite language — that's Python, not BASIC,

5:37

to keep the thing going

5:39

here. The article is called Asyncio

5:41

Handle Ctrl-C. Now, you're probably

5:44

familiar with the fact that hitting Ctrl+

5:46

C on your keyboard usually stops your

5:48

program. What you might not know is

5:51

that this is part of a family

5:53

of signals in Unix — Windows copies

5:55

some of it — but when you hit

5:57

Ctrl+C in a terminal on Unix, an interrupt

6:00

signal gets sent to your program. This

6:02

signal is often shortened to the name SIGINT,

6:05

not to be confused with signal intelligence for

6:07

anybody who's a big Tom Clancy fan or

6:09

has a military background. So

6:12

your program is actually able to

6:14

trap SIGINT, so the stopping of

6:16

your program only happens if your

6:18

program doesn't trap the signal. This

6:20

is done on purpose so that you can gracefully

6:22

shut your program down if you need to. Say

6:25

you're got to write out a file or need

6:27

to do a database commit, you can trap the

6:29

signal and make sure that those things happen before

6:31

your program actually exits. SIGINT

6:33

isn't the only signal. On Unix there's

6:35

a signal for backgrounding a process and

6:38

a kill signal that isn't trappable so

6:40

that if your signal for trapping signals

6:42

goes awry you can still kill the

6:44

program. Anyway, so

6:46

as you might intuit from the article's title, I

6:49

think IO muddies the signal trapping waters. When

6:51

you've got multiple threads or coroutines running, the

6:53

main process is what gets the interrupt signal

6:55

from the OS. If you do have concurrent

6:58

things going on, that means you now have

7:00

to decide how to handle the signal and

7:02

what to do with all those concurrent activities.

7:05

What the article does is walk you through

7:07

step-by-step as to how to build a custom

7:09

signal handler, where in your

7:11

coroutines to trap it, and how to deal

7:14

with such a graceful shutdown to make sure

7:16

that all the coroutines finish doing what they're

7:18

needed doing, and receive what's called a cancel

7:21

inside of Python to handle all this. So

7:24

if you're doing some async IO, this article

7:26

is definitely worth the read so that you

7:28

don't leave little orphaned processes happening and have

7:30

weird stuff going on in your system just

7:33

because you're doing concurrency.
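To make that concrete, here's a minimal sketch of the pattern — my own example, not the article's code — that traps SIGINT in an asyncio program and cancels the coroutines for a graceful shutdown (note that loop.add_signal_handler is Unix-only):

    import asyncio
    import signal

    async def worker(name):
        try:
            while True:  # pretend to do periodic work
                await asyncio.sleep(1)
        except asyncio.CancelledError:
            print(f"{name}: flushing state before exit")  # graceful cleanup
            raise

    async def main():
        loop = asyncio.get_running_loop()
        stop = asyncio.Event()
        # Trap Ctrl+C instead of letting it kill the process outright.
        loop.add_signal_handler(signal.SIGINT, stop.set)
        tasks = [asyncio.create_task(worker(f"worker-{i}")) for i in range(3)]
        await stop.wait()   # block here until SIGINT arrives
        for task in tasks:  # deliver a cancel to every coroutine
            task.cancel()
        await asyncio.gather(*tasks, return_exceptions=True)

    asyncio.run(main())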

7:38

Well, my first one is

7:41

about APIs, actually a very specific

7:43

API for open street maps. It's

7:45

called Overpass API and this

7:48

is on the PyBytes blog.

7:50

That's P-Y-B-I-T-E-S.

7:54

I was on their podcast recently and so

7:56

we've been following their blog and they have

7:58

a variety of people contributing. And this

8:00

one is from Judith Valkenrath, and

8:02

it's kind of a tour of

8:04

what this API can do. We

8:07

talk a lot about tools that can create

8:09

an API, fast API, Django, DRF, and Django

8:11

and Ninja, which we talk about a little

8:13

bit more later. But often

8:16

people are interested in consuming the data

8:18

and pulling it out of an API,

8:20

and maybe you're interested

8:23

in exploring that or practicing that. And I think

8:25

this is a really good overview, not only of how

8:27

this particular API works, generally how

8:29

they might be organized, sort of data structures

8:31

they have and so forth. And this is

8:33

a very rich one. There's a lot of

8:35

data, the whole planet, if you will, of

8:38

data. And one of the best

8:40

ways to get familiar with it is to

8:42

sort of just play around with going to

8:44

the website, which is, I think it's just

8:47

openstreetmap.org. As you bring up

8:49

the map, you're looking around, you can right click on any

8:52

particular element or node, and

8:54

you can see the information about it. And

8:56

she has some visuals there kind of indicating

8:58

that stuff. So like she clicked on

9:01

a particular bus stop, and

9:03

there's a bunch of tags that are associated

9:05

with that. So you can kind of see

9:07

the structure there. Does it have

9:09

a bench? Does it have a trash can? Is it

9:11

lit? Does it have a shelter? Stuff like that. And

9:13

then it says things like, oh, it's part of this

9:16

way. And so this connection of

9:18

nodes and how they are connected, the nodes,

9:20

the ways, and the relations, all

9:23

these sort of tags that are connected there. You

9:25

also kind of get an insight of how this, all

9:27

this information is sort of crowdsourced. You can actually see

9:29

the people that contributed, which I think is really interesting.

9:32

I brought up my own personal neighborhood and there's

9:34

a lot of bike paths,

9:37

walking paths around and somebody

9:40

is really into doing that stuff, which I

9:42

think is kind of cool. I guess they want

9:44

that information for themselves. And so

9:46

they've added all those around the particular area I

9:48

live using, I guess, satellite imagery or other things

9:51

like that to kind of get an idea of

9:53

it. How do you get

9:55

into consuming this? Well, you can play around

9:57

with it directly in the URL if you've

9:59

ever done that before, you might have seen

10:01

the question mark and build a query string out to

10:03

kind of see the different elements that are in there.

10:06

She describes that process, but also then

10:08

describes using wget and the type of

10:11

output that you would get in that

10:13

particular case. And then it's

10:15

intriguing the list of like how the tags kind of

10:18

connect. And there's a lot of really

10:21

deep information that I wasn't expecting,

10:23

like tourism, amenities, historic

10:25

stuff. One of the fun things that

10:27

she found is a public oven. I'm

10:30

kind of intrigued by that idea. Actually a little

10:32

more intrigued by the idea of listings

10:34

for public bookcases, but this is probably

10:36

Europe and not so much a US

10:38

kind of stuff. So the

10:40

rest of the article digs into using

10:43

this thing called Overpass, which is

10:45

the API sort of connection to

10:47

it, and then a

10:49

Python wrapper for that. I think

10:52

this is a really good resource. She talks about

10:55

using that overpass API

10:57

first with using requests and then

10:59

using the Overpass API

11:01

Python wrapper. And I

11:03

found it a good resource. Like I said, if

11:06

you're into consuming APIs, it's nice to have

11:08

the build-up from, okay, showing you what it

11:10

looks like as far as the web, URLs,

11:13

and then manually building out a query and

11:15

then sort of graduating into using Python in

11:17

a library.
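For a rough sense of the requests route — the endpoint is the public Overpass server, but the query and coordinates here are mine, not the article's:

    import requests

    OVERPASS_URL = "https://overpass-api.de/api/interpreter"

    # Overpass QL: bus stops within ~500 m of a point (illustrative coordinates).
    query = """
    [out:json];
    node(around:500,52.3731,4.8922)["highway"="bus_stop"];
    out body;
    """

    response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
    response.raise_for_status()

    for element in response.json()["elements"]:
        print(element["id"], element.get("tags", {}).get("name", "<unnamed>"))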

11:19

She shares another Python-based tutorial that is

11:21

a Medium one that actually

11:24

continues on this idea of using the Overpass

11:26

API in Python. So if you want to

11:28

continue on that, you can. And then we

11:31

at RealPython have another, if you are

11:33

looking for even more sort

11:35

of general ideas on how to use

11:37

APIs, it's called, this particular

11:39

tutorial is called Python and APIs, the

11:41

winning combo for reading public data. And

11:43

it has a variety of other types

11:45

of things that you might explore, like

11:47

GitHub, you can kind of practice using

11:49

these public data resources. So go

11:51

check that out. What's your next one? I've

11:54

got a new article from

11:56

Stephen Gruppetta on RealPython called What

11:59

Is Lazy Evaluation in Python. And

12:02

it starts out by telling you as programmers

12:04

you're all lazy and you don't work at

12:06

all. No, not at all. Lazy

12:09

evaluation is the idea of not

12:18

generating a value until you actually

12:20

need it. The most common example

12:23

of that inside of Python is with

12:25

generators. So if like say

12:27

you've got a process where you want

12:29

a series of values, a list gives

12:32

you all those values all together, whereas a

12:34

generator only creates those instances as

12:36

you process them as you go along. And

12:39

that's really where the article digs in.

12:41

It starts out by sort of showing

12:44

a small example of generating a bunch

12:46

of random numbers and the

12:48

difference between say creating a list of those

12:50

versus getting each one of those numbers generated

12:52

as you use them inside a for loop

12:54

or somewhere else where you're using them.
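Something like this minimal comparison captures the difference (my example, not the article's):

    import random

    # Eager: a million floats exist in memory at once.
    numbers = [random.random() for _ in range(1_000_000)]

    # Lazy: each float is produced only when the loop asks for it.
    numbers = (random.random() for _ in range(1_000_000))

    total = 0.0
    for value in numbers:  # generated one at a time, then discarded
        total += value
    print(total)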

12:56

There are a lot of examples in this article.

12:59

So he starts out talking about built-in

13:02

data types and how they work. The

13:04

built-in functions zip and enumerate and how

13:06

they work. Enumerate is a great

13:08

example. You typically use it around a list

13:11

and you get back pairs. The pair being a

13:13

counter of how far you are through

13:16

the enumeration and then a value from

13:18

it. And it is lazy evaluation. It

13:20

does not reprocess the list. And

13:22

so as you go along you get

13:25

little pairs out and if you just

13:27

instantiate enumerate what you'll get back is

13:29

an enumeration object rather than the set.

13:32

And so you then loop through that and use it

13:34

as you go along.
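For instance, a quick sketch (not taken from the article):

    colors = ["red", "green", "blue"]
    pairs = enumerate(colors)

    print(pairs)  # <enumerate object at 0x...> -- nothing consumed yet

    for index, color in pairs:  # pairs are produced lazily, one per step
        print(index, color)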

13:36

He starts there and then digs in a lot

13:38

deeper. There's a lot of stuff.

13:40

There's a library inside of Python in the standard

13:42

library called itertools. It provides

13:44

a whole bunch of functions that allow

13:47

you to iterate through things and this

13:49

iteration process is typically done in a

13:51

lazy fashion. So the

13:53

first example he uses from itertools

13:55

is chain, which allows you to treat

13:57

two sets of sequences as if they

13:59

were one sequence. And again,

14:01

as in the example in the

14:03

article, you go through the first piece and

14:05

then it goes through the second piece and

14:07

it's not generating a new thing it's lazily

14:09

iterating through each of those together so you

14:12

don't have to think about them you don't have to squish them

14:14

together. And generally,

14:16

the big advantage of this is that

14:18

means you do not have to create

14:20

a duplication there. So if you just

14:22

added two lists together now you've got

14:24

to take up all the memory of

14:26

both the first list the second list

14:29

and this third combined list. With

14:31

iteration and with lazy evaluation you

14:33

don't need to create that third

14:35

list which saves you a whole

14:37

bunch of memory. For

14:39

large lists, this can make a big difference.
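A tiny illustration of that memory point (again, my sketch):

    from itertools import chain

    firsts = ["ada", "grace"]
    seconds = ["guido", "barbara"]

    # firsts + seconds would allocate a brand-new third list.
    # chain() just walks one sequence, then the other, lazily.
    for name in chain(firsts, seconds):
        print(name)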

14:42

The article goes on to show you

14:44

generator expressions and generator functions which are

14:46

things you can write yourself in order

14:49

to build this kind of functionality and

14:51

then does deep dives into all sorts of tools

14:53

that you can use to go along as you

14:56

go there.
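As a flavor of what those look like — a sketch with my own toy names, not the article's example:

    def countdown(n):
        # A generator function: each yield hands back one value on demand.
        while n > 0:
            yield n
            n -= 1

    squares = (k * k for k in countdown(5))  # a generator expression, also lazy
    print(next(squares))  # 25 -- only the first value has been computed
    print(list(squares))  # [16, 9, 4, 1] -- the rest, on demand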

14:58

And finally, the article finishes off showing you sort of

15:00

the advantages versus the disadvantages of lazy

15:02

evaluation. I already mentioned sort of the

15:04

advantage there of saving you the memory.

15:06

The disadvantage is you can't do things

15:08

like slice into them because you can't

15:10

go immediately to the 50th thing because

15:12

that's not how it works; you kind of have to go through in

15:14

sequence. So, as with everything

15:16

in programming, this is a trade-off, and you

15:18

pick the trade-off that works for your situation

15:20

the best.
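Concretely, something like this (my sketch; itertools.islice is the standard workaround):

    from itertools import islice

    evens = (n for n in range(1_000_000) if n % 2 == 0)

    # evens[49] raises TypeError: generators don't support indexing.
    # islice works, but still consumes the first 49 values to get there.
    fiftieth = next(islice(evens, 49, 50))
    print(fiftieth)  # 98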

15:23

Stephen writes great stuff. He's got a very, very approachable article

15:25

style, and this one is no different.

15:27

It's an easy read. So if

15:29

you're interested in this kind of stuff it's a good place to

15:31

start. Yeah, we just had him

15:33

on the show. That was very fun, talking to

15:35

him, and we talked about his book. I

15:38

titled the episode about it being like friendly and

15:41

approachable and you know that's definitely his

15:43

style. And then what's funny is that

15:45

this I feel is very related to

15:47

the article that just came out this

15:49

week just after the lazy evaluation one

15:51

which is about sequences which kind of

15:53

makes sense and if you've dug

15:56

at all into generators

15:58

I think this is a really good build

16:00

on top of that, like kind of what's happening

16:02

and how they work with this lazy evaluation one.

16:04

So more resources for you to

16:07

dive in deeper here at RealPython. Fix

16:13

application issues before they become downtime

16:15

with Sentry error and performance monitoring

16:18

for Python. One engineering

16:20

leader that used Sentry said the

16:22

time to resolve errors went from

16:24

days to minutes. Whether

16:26

you're taming Python's processing power

16:28

or controlling JavaScript's flow through

16:30

Node.js, back-end infrastructures are

16:32

as complex as they are valuable.

16:35

With Sentry, you can trace issues

16:37

from the front end to the

16:39

back end, detecting slow and broken

16:41

code to fix what's broken faster.

16:44

Within minutes after installing Sentry, software

16:46

teams are able to trace Python

16:48

performance issues back to poor performing

16:50

API calls as well as surface

16:52

all related code errors to fix

16:55

what's broken faster. Installing

16:57

is just five lines of code and

16:59

RealPython listeners get three full months

17:01

of the team plan free with

17:04

the code RealPython on signup.

17:07

That's RealPython with no space.

17:10

Check it out at sentry.io. All

17:17

right, well, my next one is

17:19

a data science topic. It is

17:22

by Kevin Markham and

17:24

his site is dataschool.io.

17:26

This is not something that I've

17:29

personally run into and I wonder

17:31

how much controversy and potential teeth-gnashing

17:33

goes into this idea and

17:35

this concept, but the

17:37

article is titled, How to Prevent Data

17:39

Leakage in Pandas and Scikit-Learn. There's

17:41

a couple concepts being covered. He starts

17:44

with this premise that you can pretend

17:46

you're working on a supervised machine learning problem. Your

17:49

data is in the pandas data frame and you

17:51

discover, oh, we got some missing values here in

17:53

a column. And you really

17:55

think that column is important to what you're modeling and

17:58

so you want it to be a feature. Well,

18:00

how should you do this?

18:03

So there's this idea of

18:05

basically having a tool

18:07

like pandas sort of guess values for

18:10

you inside there, which I think

18:12

is a really interesting idea. So,

18:15

in this process, option one he has is:

18:17

you just fill in the missing values

18:19

in pandas, imputing the values, and then

18:22

you can continue on. And

18:25

then there's another way that he is

18:27

generally recommending for this

18:29

entire article is you should probably

18:31

leave it to a tool that

18:33

is designed for creating models and

18:35

doing stuff like that, which is scikit-

18:38

learn. You pass it that original data, and

18:40

scikit-learn can then

18:42

perform all the data transformations including

18:44

the missing value imputation with,

18:46

like, an imputer. I hadn't heard

18:49

the term data leakage before. I haven't been

18:51

in a situation where I wanted to just

18:53

say, fill in some values, because I truly

18:55

wanted to use a column like this. But

18:57

that concept is a little

19:00

weird. The leakage is not like

19:02

data is disappearing; it's

19:04

that it's crossing over

19:06

into your other process.

19:09

You start by training with your

19:12

information but if that same

19:15

imputed data is over in, when

19:17

you're testing, in

19:19

that data also, then you've got

19:21

a problem, because it's basically sort

19:24

of verifying what you've created

19:26

by the imputed data. And so,

19:28

I found kind of

19:30

another example that's a little better in

19:33

the idea of it, this one's actually

19:35

called group leakage, but it's a

19:37

similar idea. There is this example

19:39

on this, and I'll link the Wikipedia

19:42

article on it. By not having

19:44

a grouping inside of a

19:46

split column — this experiment that was

19:48

looking at X-rays to try

19:50

to determine if the

19:52

patient had pneumonia — there were one

19:54

hundred thousand X-rays of thirty

19:56

thousand patients, meaning there were

19:58

about three images per patient.

20:01

By not having the grouping, and

20:03

by having this mix

20:05

of patients records

20:08

being both in the training data and

20:10

the testing data, what

20:12

their model did was identify

20:14

people. It

20:16

did not do what it's supposed to do, which is

20:18

predict what pneumonia is supposed to

20:20

look like. So you should have

20:22

kept all those particular patients in

20:25

one group or the other and not have it

20:27

sort of randomly cut across the two.
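In scikit-learn, the usual guard against that kind of group leakage is a group-aware splitter — a small sketch with made-up data (my illustration, not from the article):

    import numpy as np
    from sklearn.model_selection import GroupShuffleSplit

    # One row per X-ray; `patients` says which patient each image belongs to.
    X = np.arange(12).reshape(-1, 1)
    patients = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4])

    splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
    train_idx, test_idx = next(splitter.split(X, groups=patients))

    # All images of a given patient land entirely in train or entirely in test.
    print(set(patients[train_idx]) & set(patients[test_idx]))  # set() -- no overlap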

20:29

So that made a little more sense to me in the idea,

20:32

but I still kind of feel a little weird about this idea of

20:34

like coming up with random data.

20:36

So what

20:38

Scikit-learn will do is

20:41

it will actually, in the

20:43

process of creating the split of

20:45

train-test split, it

20:47

will actually only create the

20:49

values in the training data

20:52

and then leave the testing part alone.
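A bare-bones sketch of that idea, assuming scikit-learn's SimpleImputer (which may or may not be the exact tool the article reaches for):

    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import train_test_split

    X = np.array([[1.0], [2.0], [np.nan], [4.0], [np.nan], [6.0]])
    y = np.array([0, 0, 0, 1, 1, 1])

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.33, random_state=0
    )

    imputer = SimpleImputer(strategy="mean")
    imputer.fit(X_train)                # statistics learned from training rows only
    X_train = imputer.transform(X_train)
    X_test = imputer.transform(X_test)  # test gaps filled with the *training* mean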

20:54

He talks about a couple different tools for

20:57

doing that inside Scikit-learn. It's a pretty short

20:59

article. I wonder how much

21:01

of this happens and I wonder if

21:03

this is a

21:06

problem with reproducibility and replication of

21:08

results by imputing values. And I

21:10

would wonder why you wouldn't

21:12

just maybe leave that value out. Yeah,

21:16

I think it comes down to like

21:18

if you think that piece of data

21:20

has strong correlation to the training, but

21:23

then it's missing in 10%

21:25

of the cases, you don't want to throw

21:27

that value out as part of the training

21:30

information, but now you've got to deal with

21:32

those 10% where you don't have it

21:34

for whatever. Your

21:36

choices are guess at it, which

21:40

is really what it is, like fancy guess. There's

21:42

guessing, there's fancy guessing. No, it's called

21:44

imputation. Sorry. And

21:46

there's dropping stuff. And honestly, I think

21:50

this might be a bit of an oversimplification, but

21:52

it kind of reminds me of like

21:54

deciding when to do rounding. Yeah, right.

21:57

So if you round too early or

22:00

round too frequently, you

22:02

can end up with a starkly

22:04

different answer than if you round

22:06

at the end. Right. And

22:08

I think really what my takeaway from this was, particularly

22:11

when you're trying to do machine learning training,

22:13

whether it's imputation or anything else, the

22:15

order of these things can make a difference. The

22:19

machine learning industry is filled with examples of,

22:21

oh, we trained it to recognize there was

22:23

a ruler in the picture rather than what

22:25

was and what wasn't a tumor. Right.

22:28

Exactly. And that's something that you've got to

22:30

be careful with when you operate in this space. Yeah.

22:33

This is sort of a related thing. I'm

22:35

going to be talking to Brett Kennedy about

22:38

his book, Outlier Detection in Python. And

22:40

I thought maybe he'd be interested in talking a little

22:42

bit about this, at least add a little bit of light to it. He's

22:45

kind of working on the opposite end of the spectrum

22:47

though. In looking for outliers, he's

22:49

trying to find literally the anomalies,

22:52

like financial fraud, bot activity, and

22:54

malicious network attacks and stuff like

22:56

that. It can also be used

22:58

for cleanliness though, because statistics, it's a very

23:00

common practice in statistics to drop the top

23:02

10% and the bottom 10%. So

23:04

that you're dealing with the middle, right? Sure. So

23:07

you don't swing your average in a certain direction or something.

23:10

So again, identifying the outliers is

23:12

kind of the opposite of imputing. It's

23:15

the find the crap that's destroying our

23:17

results. Yeah, it's pulling us off. So

23:20

how do you... And then you decide to chop

23:22

them or do whatever you want to do with them. Yeah.

23:25

I'm not going to say that too, but I think

23:27

that he's actually focusing on actually truly like, I want

23:29

to locate those anomalies because they're actually showing

23:32

me problems and things that we

23:34

want to discover. Yeah. I

23:38

seem to remember there being an example in the

23:40

book too where the outliers could cause data

23:43

grouping to create artificial groups.

23:47

It makes it look like there's two groups

23:49

when there actually aren't, because

23:51

the outliers are pulling your split line

23:53

in the wrong direction. So It's a

23:55

related kind of thing. It's the how

23:57

do we make sure our data is

23:59

helpful and shows us the actual problem

24:01

rather than getting in our way. Well,

24:03

real life is messy. Right, real

24:05

life data. As always, messy is

24:07

a problem, and outliers... That takes

24:10

us right into our discussion. We've

24:12

mentioned it a couple of times; we're going to cover

24:14

a couple things here. Yep. First one

24:16

is the Django developer survey results. Yeah,

24:19

yeah. So we've got two topics.

24:21

I tried to come up with some

24:23

convoluted way of saying they're related and

24:25

decided against it.

24:28

Both topics involve developers,

24:30

and I'm pretty sure I could say that

24:32

about every discussion we've ever had.

24:34

Yes. And about surveys, of sorts?

24:36

Exactly. Yes. Right, so

24:39

the first topic is: the results

24:41

of the 2023 Django Developer

24:43

Survey have come out. We've discussed the

24:45

Python Developer Survey before, and like that

24:47

one, this one is sponsored by

24:49

JetBrains, so thanks to those folks for doing

24:51

this. There's some interesting data in here.

24:53

There are a couple things that kind of

24:55

stuck out for me personally. First off,

24:57

there's almost a seventy percent adoption rate

24:59

of the latest version, which at the

25:01

time of the survey was 4.2.

25:04

Impressive, actually. Yeah. On one

25:06

hand, I was kind of

25:08

surprised, and then I stopped and sort

25:10

of thought about it. And honestly, the

25:12

differences between 4.1 and 4.

25:14

2 were minimal, so it was a

25:16

relatively painless progression. I think that makes

25:18

it easier for adoption. For databases, the big

25:20

winner was PostgreSQL: seventy-six percent

25:22

of respondents said they were using that,

25:24

followed by forty-three for

25:26

SQLite. Ah, do note this was

25:28

multi-select, so the percentages don't add up

25:30

to one hundred, if you're wondering why

25:32

it'd be seventy-six and forty-three — math

25:34

doesn't quite work there. Yes. I was

25:37

a little disappointed to see that

25:39

twenty-one percent of folks don't

25:41

do automated testing. I'm hoping that's

25:43

partially because of things like hobby

25:45

sites, you know — my cookbook site, which

25:47

is a little hobby Django app, does

25:49

not... is not tested there. So,

25:51

the flip side of it

25:53

is my consulting career has told me that

25:56

that's probably overly generous. It's always depressing

25:58

when I come in as a consultant to a large

26:00

institution like a bank or insurance company to

26:02

find out, oh, you're still doing manual

26:04

QA. Isn't that great? Mm. The

26:06

other one that was a bit of

26:08

a pleasant surprise was the prevalence

26:10

of macOS. So thirty-

26:12

six percent of respondents said they

26:14

use Linux, and thirty-two percent

26:16

said macOS, and that's

26:19

almost three times the adoption rate

26:21

of Macs in the general public.

26:23

They're only about ten percent generally,

26:25

and it's also three times the

26:27

reported usage rate of Windows without

26:29

the linux subsystem. So pure Windows

26:31

without it as an entry point

26:33

to Linux. And I suspect

26:35

that's the same reason Linux is

26:37

out there: where are most

26:39

people doing web development? They're doing

26:42

Unix on the backend. And

26:44

Mac is a very pretty version of

26:46

Unix, is really what it comes down

26:48

to. Yeah, I was just

26:50

surprised at the adoption rate there. And

26:52

then the last one for

26:54

me: there were two questions

26:56

that sort of depressed me. First, my

26:59

treasured Vim only has a seven percent

27:01

adoption rate, and second, I'm two sigma

27:03

to the right of the mean on

27:05

the ages graph. So from one old

27:07

fogy to another, what

27:09

stood out for you? I like the

27:11

term 'lovably vintage' you have.

27:16

Lovely. Yes, I'm

27:18

younger than BASIC,

27:20

if that helps. Uh,

27:22

yeah, there are two things

27:24

that are related to what

27:27

you were talking about with testing. There's one about

27:30

continuous integration systems, which

27:33

is at probably a higher value — it's like

27:35

only like a quarter of them say

27:37

none. And then there's configuration tools,

27:39

which are actually pretty low — like

27:41

having tools that can do that stuff

27:43

for you, automatically set up those kinds of

27:45

things. I thought the third-party framework

27:47

stuff is interesting. Django REST

27:50

Framework has dropped a little bit, not a

27:52

lot, but I would guess that maybe

27:54

Django Ninja is creeping in there and

27:56

taking away some of that. I was

27:58

surprised it didn't show. Yeah,

28:00

I think somebody mentioned it and so

28:02

it's growing. I

28:05

had a question about models versus views. I

28:07

know this is like really Django nerdery, but

28:09

I don't know. Is, are you in the,

28:11

with the 69% that says models are the

28:14

favorite core component, and maybe don't use

28:16

the view based stuff as

28:18

much? I, yeah, I think

28:21

I probably, I'm trying to remember what

28:23

boxes I checked. I suspect it was

28:25

probably the admin. Yeah. Would,

28:27

would be my guess, which is very

28:30

handy. Probably authentication. I

28:32

can't remember. I, you know, core

28:35

components are core components. I use all, almost

28:37

all those things all the time. So

28:40

I don't know what difference it makes.

28:43

You know, the model — for non-Django

28:45

folks — is what sits in front

28:47

of the database. So, uh, if you're not using

28:49

the database, you're probably more likely to use something

28:52

like fast API. So the fact that you're using

28:54

Django kind of tells you that you're there anyways.

28:56

So it's not surprising. And that's part of, you

28:58

know, why I checked the box for admin. It's

29:01

the same thing. It's a built-in tool that allows

29:03

you to muck with the models. And it means

29:05

you don't have to do very much coding to

29:07

get a GUI to interface with the database. Yeah.

29:10

I found it almost

29:12

an odd question. Yeah,

29:15

sure. So, but yeah, the other

29:17

one that I thought was interesting is sort

29:19

of the growth of HTMX as a

29:21

quote-unquote JavaScript framework, um, how a lot

29:24

of these other ones are kind of dipping

29:27

a little bit — uh, React and Vue and

29:29

so forth, and whatever the flavor du

29:31

jour is. I think people are not as interested

29:33

in dealing with the

29:36

overhead of having to like have all that

29:38

JavaScript in your head to work with it.

29:40

Yeah. I, before I discovered HTMX myself, I

29:42

was using Vue.js a lot, and

29:45

it wasn't because I really wanted to, it

29:47

was because sometimes your pages need some dynamic

29:49

stuff. Right. And what I really like about

29:51

Vue is the fact that it allows you

29:53

to, you can adopt varying

29:55

degrees of it. So if you just need a

29:57

tiny little bit of dynamism on a page, you

30:00

don't have to go all or nothing. Whereas,

30:02

you know, the React and Angular of

30:04

the world tend to be, uh, rather

30:06

opinionated in your entire site. It has

30:08

to zow-shelf type script

30:11

rather than, but all

30:13

I needed was an auto-select here and

30:16

so I was, I was very, I tended

30:18

to use view because of that and the

30:20

HTML just makes that so much easier. There's

30:22

just so much less code. And because of

30:24

the way HTMX works with partial pages, it

30:27

integrates very nicely with Django because Django can feed

30:29

out partial pages to your heart's content.

30:31

Right. So you're basically taking advantage of the

30:33

templating system, and it's

30:35

very 1.0 style coding with very

30:38

2.0 style, uh,

30:40

interfaces. So yeah, it doesn't surprise me

30:42

that it's taking off. That's nice. Are

30:45

you, I thought it was interesting. I looked at

30:47

the CSS thing frameworks. I thought that was interesting

30:50

too. That bootstrap is kind of dropping off after

30:52

a long time at the top. I would argue

30:54

it's still at the top, but the, there's something called

30:56

Tailwind. Yeah. That's growing. And then there's a few others

30:59

that I've seen inside there, but yeah, bootstraps been around

31:01

for a long time, I would say that you can

31:03

definitely tell the age of a site that's using it

31:05

now. Yes. So yeah,

31:08

cool. It's good to see that. You know,

31:10

I was mentioning to this to

31:12

you before we started that, you know,

31:14

I've found... for years, I've been kind

31:17

of paying attention on the side of watching Django and

31:19

having done a little project myself and tried to set

31:21

up some stuff at work and just

31:23

paying some attention to it. And it just

31:25

looks so healthy. Yeah.

31:27

Growing and you know, the Django

31:29

naut thing that we mentioned earlier for onboarding

31:32

contributors and just looks good. I, I

31:35

like the batteries included of it in a

31:37

lot of ways. So, you know, of

31:40

course we're not going to get through this

31:42

topic without, you know, the old joke about

31:44

marathoners and how do you know somebody ran

31:46

a marathon? You know, they'll effing tell you

31:48

or went to Harvard. Yeah. Well, so

31:51

it goes the same for authors. So we can't

31:53

get to the bottom of this without me shilling a

31:55

little bit, but, I was

31:57

pleased to have Michael Kennedy.

31:59

of Talk Python acquiesce to

32:01

writing the foreword for the

32:04

book. Oh, nice. One of the

32:06

things he mentions in it is

32:08

just how deep the Django community

32:10

is, right? Like outside of most,

32:12

most of the time you see

32:14

language conferences, you very, very seldom

32:16

see library conferences. Yeah. Frameworks. Yeah.

32:19

Or if they are their libraries,

32:21

like they're, they're a track inside

32:23

of conference, whereas, you

32:25

know, DjangoCon is huge, right? Yeah.

32:27

This is European versions and stuff

32:29

out there. There's a North American one, European

32:31

one, and an African one now. Oh,

32:33

wow. And so like there's, and the

32:36

community is very, very vibrant and there's a lot of

32:38

sort of support there. And one of the things he

32:40

kind of said in the foreword is sort

32:42

of, you know, welcome to this world. Like it's, uh,

32:45

it really is a very vibrant thing and,

32:47

and people are, they're actively trying to

32:49

figure out how to get more contributors,

32:51

which, you know, you see in very,

32:53

very few places, uh, in the open

32:55

source world, often. It's sort of, go away,

32:57

go away little boy, you're bothering me.

32:59

Whereas here it does seem to be

33:01

very, uh, an accepting sort of community.

33:03

So, uh, I think that's, I think that's definitely part

33:05

of it. Yeah. Cool. This

33:11

week, I want to shine a spotlight

33:13

on another real Python video course. It's

33:15

connected to one of our discussions this

33:17

week. When you start building any new

33:20

Django web application, there's the

33:22

fundamental step you need to tackle first. The

33:24

course is titled how to set up

33:26

a Django project. And it's

33:28

based on an article by former guest and

33:31

real Python team member, Martin Breus. In

33:33

the video course, Martin takes you through setting

33:36

up a virtual environment, installing

33:38

Django, pinning your project

33:40

dependencies, setting up that

33:42

Django project and starting your first

33:45

Django app. It also

33:47

covers techniques to manage a slimmer structure

33:49

for your Django project. I

33:51

think it's a worthwhile investment of your time

33:53

to learn the fundamental steps to properly set

33:55

up a Django project and to

33:57

get a solid understanding of the structure of

33:59

a Django project and Django application. Like

34:01

most of the video courses on RealPython,

34:03

the course is broken into easily

34:06

consumable sections. And all RealPython

34:08

courses have a transcript, including closed captions. Check

34:10

out the video course, you can find a

34:12

link in the show notes, or

34:14

you can find it using the enhanced

34:17

search tool on realpython.com. So

34:22

shall we move on to topic two? Yeah, let's

34:24

go ahead. All right, so... A survey. Sort

34:27

of, yeah. Our next

34:29

source of discussion is an article by

34:32

Daksh Gupta, and it's titled, I Asked

34:34

100 devs why they aren't shipping faster.

34:37

Daksh is the founder of Greptile, which

34:39

is an AI coding tool. So I

34:41

suspect he talks to devs all the

34:43

time as part of that whole sales

34:45

process. The article name checks a couple

34:48

dozen companies, almost all of which are

34:50

household names, or would be household names

34:52

for houses containing programmers. The

34:55

article goes into detail about these kinds

34:57

of conversations he's had. But I'll start

34:59

out with the two things that surprised

35:01

him. First, nobody had to pause

35:03

for an answer. You ask them, you know, why aren't

35:05

you shipping faster? And everybody immediately went, oh, yeah, off

35:08

the top of my head, here are our reasons. Yeah,

35:11

the reason this is a bit of a surprise

35:13

to him was that there's sort of this aspect

35:15

of, well, if you know that, why wouldn't you

35:17

be fixing it? Well, and as

35:19

we'll get into in a minute, sometimes it's out of our

35:21

control to fix. And then

35:23

the second surprise he had was how

35:26

many of the answers were actually tool

35:28

chain answers. So slow to build, slow

35:30

to compile, slow deploy times. I

35:32

find it interesting because the rest of the

35:35

article spends very little time there. But

35:38

it's definitely one of the topics that comes up.

35:40

And I think it comes up because it's sort

35:42

of the immediate thing that annoys us code junkies.

35:44

So what stood out for you here? Oh,

35:48

it started with some of the stuff that's

35:50

really hard to solve, in my opinion, like dependency bugs,

35:52

like, you know, hitting random mysterious

35:54

bugs with libraries and reading

35:57

into Stack Overflow to try to

35:59

find out what's happening or GitHub issues and so forth.

36:01

And that rolls

36:03

into this next one of a complicated code

36:05

base, you know, where they're not

36:08

updating their documentation and it becomes like a

36:10

vicious cycle of like shipping

36:12

features versus like keeping the docs up to date and

36:14

so forth. I wonder about

36:16

that, like this idea of cross

36:18

languages. We just mentioned, you know, the idea of

36:21

using JavaScript with Django and so forth and how

36:23

that isn't like doubling the

36:25

complexity. It's sometimes like multiplying it

36:28

in a much larger fashion. So,

36:30

but the really interesting stuff is like, as

36:32

you get further and further into

36:35

it, it becomes more about the

36:37

people and less about the

36:39

technology. And

36:41

so, you know, stuff like the QA loops,

36:44

you know, meetings, reviews, weird stuff,

36:47

like developers having too much free

36:49

time and personal vendettas and prejudice.

36:51

And it's just like, I

36:53

have only been in corporate America for a short period of

36:55

time here and there. And so I

36:57

only, I kept saying this to

36:59

people at different jobs that had that were inside

37:02

large buildings that had lots

37:04

of offices where I would say, is

37:07

this normal? And I guess every,

37:10

I dunno, what is the, what is the saying? Every

37:13

family is broken

37:15

in its own way or whatever. I

37:19

don't know the coin as well as I wanted to, but

37:21

you know, and I think that's it. Like, you know, every

37:24

unhappy family is unhappy in its own way.

37:26

And so it just makes me like kind

37:29

of like, I don't have

37:31

a lot of ideas about how you would solve these

37:33

things. Mostly, I think about like, man, I would not

37:36

be interested in them. I thought of some

37:39

things that I've talked to people in the industry

37:42

about at

37:44

least increasing communication and sort

37:46

of making sure the engineering

37:48

teams can potentially work together.

37:50

And that was like the conversations that

37:52

I had with Pablo about Bloomberg. We

37:54

talked about guilds and this idea that

37:56

they have internal guilds. This

38:00

is where a bunch of people that are interested in Python

38:03

can go and learn more about that and so forth. And

38:05

so people came from other teams

38:08

and then that kind of spawned

38:10

conversations across different silos,

38:12

if you will. I thought that was interesting.

38:14

And I don't know if it solves that.

38:17

Well, so one of the reasons guilds

38:19

have become more popular in the last

38:21

decade or so was the

38:23

old waterfall way of doing things was you had the

38:25

development department and they were on a different floor than

38:27

the QA department who were on a different floor from

38:29

the BA department. It's a good form of

38:31

waterfall. And

38:33

the agile space very much tries

38:36

to have cross-functional, team-driven

38:38

pieces. So you get a group of people together

38:40

who are solving the same problem, which means

38:42

inevitably you have some developers, some QA's,

38:44

some BA folks working together. We

38:47

can debate for hours as to what works,

38:49

what doesn't work, but at least then you've

38:51

got a team who are concentrating on that

38:53

problem and solving the problem. The

38:56

downside of that is it becomes a

38:58

lot harder to cross pollinate ideas because

39:00

now all the developers aren't on the

39:02

same floor. So the guild

39:04

concept kind of came out of that

39:06

where you end up sort of lunch

39:08

and learns or whatever where the developer

39:11

folks can all leave their

39:13

feature focused teams and go to

39:15

a process focused area. Some

39:17

organizations will try to put, I don't

39:19

want to call them management and they're not really

39:22

leaders, but sort of architect type

39:24

people who are watching

39:26

what's happening on the different teams and are in

39:28

charge of trying to make sure the best practices

39:30

are being handled so that you

39:32

can get growth across the team. And

39:35

it's a hard problem. So some of

39:37

the things that you have by having those development

39:40

silos, these one set of

39:42

problems doesn't show up, but you get a different

39:44

set of problems and then you get rid of

39:46

those silos and make them feature focused and that

39:48

solves one set of problems and creates a different

39:50

one. Yeah,

39:52

it's always these things are always

39:54

a trade off, right? You know, it's

39:57

take the next thing I'm about to say with a grain of salt because

39:59

of course this is. how I make my money. I

40:01

have found that consulting can help in this.

40:04

And the reason for that is

40:07

kind of counterintuitive. Oftentimes

40:09

what consultants are doing is they're being

40:11

paid an awful lot of money. And

40:13

because of that, they're listened to. So

40:16

I can't tell you how. Yeah, I could

40:18

tell you that how often they listen to

40:20

them and not people that are there. Yeah.

40:22

Yeah. So I can't tell you how often

40:24

my job has been go in, talk

40:27

to everybody, write down what they say

40:29

the problems are and then report those

40:32

problems to management. And you

40:34

know, and it's stupid, right? Because if I wasn't there,

40:37

yeah, if I

40:39

wasn't there, they definitely could have

40:41

still solved the problem. Right. And just by listening

40:43

to their own people. But coming

40:45

in as an outsider and providing a little

40:47

bit of guidance and, you know, massaging a

40:49

little bit of data together, magically

40:52

all of a sudden management listens.

40:54

And so that can actually make

40:57

a bit of a difference. I

40:59

would almost like it if the world

41:01

didn't work that way, but you know,

41:03

it's something you can almost take advantage

41:07

of. I mentioned

41:09

some other things to you kind of before we started

41:11

too, just that I was thinking about, because

41:13

we were going to discuss this. And I feel

41:17

like maybe we're in a cycle, as far

41:20

as, you know, technology, you know,

41:22

the world that we're in. I mean,

41:24

there's definitely this whole AI hype thing, which is kind

41:27

of becoming a little bit much. There's

41:31

a lot of these things that are coming out where

41:34

they sure did ship it, but

41:36

it didn't have the software and it

41:38

wasn't working and all this other sort

41:40

of stuff. And so

41:43

who is pushing so hard to ship

41:46

that stuff that isn't done? And I

41:49

feel like, you know, like they're

41:51

answering to a whole different set of

41:53

people and it's not necessarily like an

41:55

overall company goal. It's like the sales

41:57

people, the advertising people, the marketers and

41:59

just... saying, Oh, well, we'll just fix it later.

42:02

I mean, we see it in so many industries, the video game

42:04

industry, that that's, you know, we're beta

42:07

testing games now, you know, through

42:09

the pre-release cycle and so forth,

42:11

and sometimes that's okay, which

42:13

is interesting to see things get developed later,

42:15

but another time is when it's like a

42:17

physical product and you're paying potentially a monthly

42:19

fee for something that doesn't really work

42:21

is kind of nuts, you

42:23

know? The balancing act is really, really

42:26

difficult. If your technical

42:28

people are too much in

42:30

charge, nothing will ever go out

42:32

the door because they want it perfect. And

42:35

if you're marketing people are too much in charge,

42:38

then the deadline becomes

42:40

about everything and, you know, we

42:43

bought the ads already. Yeah. There's

42:46

definitely an assumption that first

42:48

mover gets an advantage. I'd

42:51

love somebody to actually do a study on that because

42:54

sure. There are some of

42:56

the key, several

42:58

of the letters in FAANG, most

43:02

of them weren't first movers, Facebook

43:04

was not, Google was not, and

43:07

yet they're dominant, right? I don't know, you

43:09

know, that's anecdotal, right? So I don't know

43:11

how often first mover actually makes a difference,

43:14

but business-wise that often tends to be the

43:16

answer. We have to be first mover. And

43:18

so then you get half baked going out

43:20

the door. The other part

43:22

of it too is, you know, and this

43:24

comes from sort of the organic

43:26

processes that happen in the different

43:28

departments that happen inside of an

43:30

organization, you always hear developers swearing

43:32

about sales because they've promised something

43:35

in order to, you know, Oh

43:37

yeah, of course it does that. And then it doesn't, and then

43:39

development are the ones who are, have to clean it up. Well,

43:44

sales wants to make

43:46

sales and if sales should

43:48

want happy customers, right, because that allows them

43:50

to make more sales. If

43:52

your management is properly thinking about how

43:54

that happens and making sure that the

43:56

salespeople are, you know, one of

43:58

the things that I've. I've worked with

44:00

organizations where the commission doesn't happen unless the

44:03

CSAT numbers are of a certain size, right? And so

44:05

like you get half your commission when you make

44:07

the sale and you get the other half of

44:09

the commission once it's actually been implemented and your

44:11

CSAT score is above 8

44:15

or whatever the number is. And it's

44:18

a completely different dynamic because the salespeople

44:20

are, they want to work with

44:23

the engineers because they're not going to get

44:25

their money unless, right? And

44:28

so there are things management can do to try and

44:30

help this kind of stuff, but

44:32

we have a long history of not

44:34

doing it. And unfortunately the

44:37

vast majority of C-suite

44:39

are not engineers. And

44:42

so the MBAs rule the world.

44:47

Definitely what seems to be happening in a

44:49

lot of Silicon Valley and then

44:52

just the idea of let's

44:55

make sure that we just pay attention

44:57

to the stock

44:59

and so forth. And I feel like

45:01

there was a little bit more of a wall, but

45:04

it's just gone now. It's like, it seems

45:06

like it's completely fallen apart. And

45:08

well, speaking of falling apart, like Boeing's a case

45:11

study in this, right? Oh God. Yeah.

45:14

Yeah. Right. So they

45:16

were an engineering driven company and then the merger happened and

45:19

the MBAs took over and it's

45:21

been downhill since then. So

45:24

I heard about the

45:27

slide just this morning. I don't know

45:29

if you heard that one. No, I

45:31

missed this. There's an emergency exit

45:33

slide that got detached in

45:35

flight or whatever. Oh, fine. Another Boeing.

45:39

And then they couldn't find it. And then they

45:42

found it at the one of the lawyers

45:44

who's suing Boeing. At least that's what I

45:46

understand. So yeah, that's pretty bananas. Hey,

45:51

you guys lost this. Anyway,

45:56

I guess it's sort of sobering in a way.

45:59

I wish there were more solutions

46:02

or ideas in there

46:04

for, I mean, it's,

46:07

you know, first step spot the problem, sure.

46:09

But from there, like, you know, what can you

46:11

do to kind of get beyond it? And I

46:13

don't know if things are so endemic,

46:16

you know, inside of a company that you

46:18

can't quite get past it. I

46:20

don't know. I just feel like we

46:22

are definitely in a cycle right now where a

46:24

lot of this is happening and the push to

46:27

ship things is like so strong. And

46:29

then the push to cut the bottom line is there.

46:32

And like, okay, it's just not really sustainable.

46:35

Well, you know, it's one of

46:37

the things I try to coach

46:39

inside organizations is make

46:42

sure that the tech side understands

46:44

that without the sales side, they

46:47

don't get paid. And make

46:49

sure that the sales side understands that without

46:52

the tech side, you don't have a product

46:54

to sell. So let's start talking about this

46:56

as a problem we're solving together rather than,

46:58

you know, creating these fiefdoms,

47:00

right? Yeah. Anyway,

47:03

shall we move on to projects? Yeah, I think

47:05

digging into projects is probably a good idea. We're

47:09

not going to solve this. So yeah, yeah.

47:11

So mine, it uses two things

47:14

that we talk about a lot. One is

47:16

pipx, which you've mentioned, maybe even

47:18

last week, I think. And

47:20

the other is uv, which you've mentioned a few

47:22

times, which is a tool that

47:25

in some ways can be for packaging,

47:27

but also can in some ways replace

47:29

what Pip does. So this

47:31

is a project called pipxu. And

47:35

it's installing and running Python applications

47:37

in isolated environments using uv. The

47:39

project's by Mark Blakeney,

47:41

and he goes by bulletmark. And

47:43

then the other person that is on the project is Patrick

47:46

Zajja. Each package

47:48

and its dependencies are insulated from

47:51

other applications and from the system

47:53

Python. That's kind of a hallmark of

47:55

pipx, which is really kind of nice. It's

47:58

a good way to install. tools that you

48:00

use across a

48:03

system that you want to have access,

48:05

but it won't be embedded in other

48:07

particular projects you have. So again,

48:10

if you're familiar with using pipx, pipxu is just

48:13

kind of a thing that provides a

48:15

speed improvement in a lot of ways.

48:17

It also is lighter weight

48:19

in the sense that pip is one of

48:21

these dependencies that

48:24

does take up some space and also

48:26

is updating itself pretty often. And

48:28

so this is something that on

48:30

the uv side is going to be a little quicker.

48:32

I installed it, played around with

48:34

it. I played around with a couple different tools

48:37

that I like to use and yeah, it just

48:39

worked really, really simply. In some ways it was

48:41

a little more verbose in its installation, which

48:43

was kind of nice kind of showing me things

48:46

that were in there. I don't use

48:48

pipx all the time, so maybe I, you know, could

48:50

turn on a different flag or two and I would have

48:52

seen similar sort of stuff, but I

48:54

like the defaults that it provided. So yeah,

48:56

if you're interested in pipx and you're interested

48:58

in maybe speeding it up a little bit.

49:00

This is taking advantage of

49:02

the enhancements of uv. What's

49:05

your project? I've got two which are

49:07

related to each other. Okay, the first

49:10

one is called Tenacity and it's by

49:12

Julien Danjou. This library is

49:15

for doing retries in your code. So it

49:17

started out as a fork of a library

49:19

actually called retrying which is no longer maintained.

49:22

So let's say you've got a flaky network

49:24

connection and you want to try sending something

49:26

a few times before giving up. Tenacity

49:29

ships with decorators that you can apply to

49:31

a function that give you this functionality. For

49:34

example, the decorator can catch exceptions and then

49:36

retry the function again. And if the function

49:38

passes then Tenacity does nothing and you can

49:41

go along your merry way. You

49:43

can control how many times to

49:45

retry. You can control how long

49:48

to wait between retries, conditional retries

49:50

based on specific exceptions, or

49:52

you can call a truthy function that decides

49:54

whether or not to retry it. And

49:57

in fact, it also has tools to

49:59

hook in with managing logs and monitors

50:01

in case you want to give information

50:03

about retrying. So, neat little library.
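For a flavor of the API — my own sketch, using decorator options Tenacity documents:

    import urllib.request
    from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

    @retry(
        retry=retry_if_exception_type(OSError),       # retry only flaky-network errors
        stop=stop_after_attempt(5),                   # give up after five tries
        wait=wait_exponential(multiplier=1, max=30),  # back off between tries
    )
    def fetch(url):
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.read()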

50:05

And then the second project is a wrapper

50:07

to Tenacity and it's called Stamina. It's

50:10

by Hynek Schlawack. We've

50:13

mentioned him on the show before back in

50:15

episode 145. We featured his

50:17

article on the Knox testing tool. The

50:20

easiest way to explain stamina is to

50:22

quote the README. Stamina is

50:24

an opinionated wrapper around the

50:26

great but unopinionated Tenacity package.

50:29

So essentially it tries its best

50:31

to pick reasonable defaults for things

50:33

and to provide some helpers. So

50:35

for example, out of the box

50:38

stamina rather than retrying on all

50:40

exceptions, it only retries on certain

50:42

exceptions. So you know that network

50:44

example I was giving you, IO error is probably a

50:46

common one to do, but if it throws up a

50:48

key error then you wouldn't. Likewise

50:51

it uses exponential back off to wait

50:53

between retries, which is a good philosophy

50:56

and limits the number of retries and the

50:58

total time taken. So it's essentially taken good

51:01

ideas about how to do a retry

51:04

and implemented it using Tenacity. The one

51:06

feature I really like is it also

51:08

has a helper method that allows you

51:11

to globally disable the retry mechanism, which

51:13

can be really really handy when you're

51:15

testing. And the documentation shows

51:17

you how to do this with a

51:19

pytest fixture. So if you need retry

51:22

stuff, these libraries are great. If

51:24

you're already a Tenacity person, I definitely

51:26

recommend taking a look at stamina because it might

51:28

actually save you some coding.
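And for comparison, roughly the same thing with stamina's documented decorator, including the global off switch that comes up below:

    import urllib.request
    import stamina

    @stamina.retry(on=OSError, attempts=3)  # exponential backoff by default
    def fetch(url):
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.read()

    # In a test suite you can deactivate every retry globally:
    stamina.set_active(False)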

51:30

You mean tenacious person, right? I

51:34

like the attempts and so forth. You can

51:36

kind of like put values in for that

51:39

and I guess you would in the testing

51:42

runs of these things, you would just maybe lower

51:44

it to one or something like that just to

51:46

make it go through. It has

51:48

a global setter getter kind

51:51

of function in the library and you

51:53

essentially just say turn it on

51:55

turn it off and it's as if the

51:57

decorator is not there. Okay awesome. Well,

52:00

thanks for bringing all these PyCoders

52:03

Weekly articles and projects this

52:05

week and discussion topics. Thanks,

52:08

Christopher. Great. I'll see you in a

52:10

couple weeks. Okay. And

52:15

don't forget, installing Sentry is just

52:17

five lines of code. And Real

52:19

Python listeners get three full months

52:21

of the team plan for free

52:24

with the code RealPython on

52:26

signup. Check it out

52:28

at sentry.io. I

52:31

want to thank Christopher Trudeau for coming on the show

52:33

again. And I want to thank

52:36

you for listening to The Real Python Podcast. Make

52:38

sure that you click that follow button in your

52:40

podcast player. And if you see a subscribe button

52:42

somewhere, remember that The Real Python Podcast

52:45

is free. If you like the show,

52:47

please leave us a review. You can

52:49

find show notes with links to all

52:51

the topics we spoke about inside your

52:53

podcast player or at realpython.com/podcast. And

52:55

while you're there, you can leave us

52:58

a question or a topic idea. I've

53:00

been your host, Christopher Bailey, and look forward

53:02

to talking to you soon.
