WEBVTT

1
00:00:10.080 --> 00:00:13.185
<v Mike Rugnetta>Friends, hello,
and welcome to Never Post, a

2
00:00:13.185 --> 00:00:15.265
podcast for and about the
Internet. I'm your host, Mike

3
00:00:15.265 --> 00:00:20.065
Rugnetta. This intro was written
on Tuesday, September 23, 2025, at

4
00:00:20.065 --> 00:00:25.950
11:56 AM Eastern, and we have a
slightly late show for you this

5
00:00:25.950 --> 00:00:28.910
week. Sorry about that. I got a
toddler cold, and we're in the

6
00:00:28.910 --> 00:00:32.190
middle of renegotiating AI
Mike's contract, but we're here.

7
00:00:32.590 --> 00:00:37.150
We made it. We did it. In our
third-ever show-length segment,

8
00:00:37.150 --> 00:00:41.285
contributing producer Tori
Dominguez-Peak returns to look

9
00:00:41.285 --> 00:00:44.885
at artificial intelligence in
the classroom, including one

10
00:00:44.885 --> 00:00:50.485
instructor who has said no more
and has banned its use entirely.

11
00:00:50.725 --> 00:00:54.245
Tori tells Jason what that
entails and tackles the old

12
00:00:54.245 --> 00:01:02.100
canard: is writing thinking?
And also Bop Spotter. But right

13
00:01:02.100 --> 00:01:03.540
now, we're gonna take a quick
break.

14
00:01:03.540 --> 00:01:05.780
You're gonna listen to some ads
unless you're on the member

15
00:01:05.780 --> 00:01:08.500
feed. And when we return, we're
gonna talk about a few of the

16
00:01:08.500 --> 00:01:11.855
things that have happened since
the last time you heard from us.

17
00:01:13.055 --> 00:01:16.895
Hello? Is it five stories for
you this week you're looking

18
00:01:16.895 --> 00:01:22.015
for? YouTube is reinstating the
channels of creators previously

19
00:01:22.015 --> 00:01:25.535
suspended for violating
COVID-19 and election

20
00:01:24.830 --> 00:01:28.190
disinformation guidelines that an
Alphabet lawyer says are no

21
00:01:28.190 --> 00:01:29.630
longer in force.

22
00:01:29.710 --> 00:01:32.670
This according to a document
obtained by Fox News and

23
00:01:32.670 --> 00:01:35.870
prepared for the US House
Judiciary Committee. Reflecting

24
00:01:35.870 --> 00:01:38.990
on the company's commitment to
free expression, Daniel F.

25
00:01:38.990 --> 00:01:42.375
Donovan, counsel for Alphabet,
writes, YouTube will provide an

26
00:01:42.375 --> 00:01:45.655
opportunity for all creators to
rejoin the platform if the

27
00:01:45.655 --> 00:01:48.455
company terminated their
channels for repeated violations

28
00:01:48.455 --> 00:01:52.535
of COVID-19 and elections
integrity policies that are no

29
00:01:52.535 --> 00:01:55.920
longer in effect. YouTube takes
seriously the importance of

30
00:01:55.920 --> 00:01:59.360
protecting free expression, the
document states elsewhere, and

31
00:01:59.360 --> 00:02:02.880
access to a range of viewpoints.
The document also explicitly

32
00:02:02.880 --> 00:02:06.335
points out that YouTube has not,
does not, and will not employ

33
00:02:06.335 --> 00:02:09.295
any kind of fact checking or
labeling mechanism in its

34
00:02:09.295 --> 00:02:12.735
software. At time of writing, no
list of the channels to be

35
00:02:12.735 --> 00:02:16.095
potentially reinstated has been
published, but I bet it's not

36
00:02:16.095 --> 00:02:18.255
that hard to figure out who
might be on it.

37
00:02:18.255 --> 00:02:24.080
In completely unrelated news,
Alex Jones recently appeared on

38
00:02:24.080 --> 00:02:29.120
his show sporting a Hitler
mustache about which he said, I

39
00:02:29.120 --> 00:02:31.440
could tell you it had a wild
effect on women.

40
00:02:31.440 --> 00:02:32.240
<v Jason Oberholtzer>Ew.

41
00:02:33.360 --> 00:02:36.325
<v Mike Rugnetta>The US Secret
Service shut down a high powered

42
00:02:36.325 --> 00:02:39.045
cellular network that they
claimed posed a threat to
tri-state area mobile communications
43
00:02:39.045 --> 00:02:43.125
state area mobile communications
this week. CBS News reports

44
00:02:43.125 --> 00:02:46.405
that, quote, law enforcement
discovered 300 SIM servers and

45
00:02:46.405 --> 00:02:49.850
over 100,000 SIM cards, enabling
encrypted anonymous

46
00:02:49.850 --> 00:02:53.450
communication and capable of
sending 30,000,000 text messages

47
00:02:53.450 --> 00:02:56.890
per minute that could have,
again, allegedly disabled cell

48
00:02:56.890 --> 00:02:59.450
phone towers and launched a
distributed denial of service

49
00:02:59.450 --> 00:03:02.490
attack with the ability to block
emergency communications like

50
00:03:02.490 --> 00:03:06.335
EMS and police dispatch, end
quote. The Secret Service claims the

51
00:03:06.335 --> 00:03:09.375
operation was well funded and
possibly under control of state

52
00:03:09.375 --> 00:03:12.975
actors looking to cause trouble
for UN week in New York.

53
00:03:13.055 --> 00:03:16.140
Independent tech auditors and
security analysts are not so

54
00:03:16.140 --> 00:03:19.500
convinced. Well funded, yes, but
capable of causing such

55
00:03:19.500 --> 00:03:23.420
widespread havoc in New York
City of all places, not so much.

56
00:03:23.500 --> 00:03:25.660
There is nothing about this
infrastructure that would be

57
00:03:25.660 --> 00:03:29.580
hugely disruptive or damaging to
mobile phone networks, writes

58
00:03:29.580 --> 00:03:32.995
TProphet, the self-described
Telecom Informer for the hacker

59
00:03:32.995 --> 00:03:41.555
magazine 2600, on Bluesky.
BookTok has managed to shoot

60
00:03:41.555 --> 00:03:46.110
Timothy Snyder's lean but
weighty 2017 book On Tyranny to

61
00:03:46.110 --> 00:03:49.230
the top of indie bookshop sales
lists over the last few months.

62
00:03:49.230 --> 00:03:52.750
On Tyranny's bullet-point style
format and short chapters,

63
00:03:52.750 --> 00:03:55.870
writes Laura Miller for Slate,
make it easy to break into

64
00:03:55.870 --> 00:03:58.830
nuggets of exhortation. A
particular favorite is lesson

65
00:03:58.830 --> 00:04:03.175
number one, do not obey in
advance, urging individuals and

66
00:04:03.175 --> 00:04:07.095
institutions not to appease
authoritarian governments before

67
00:04:07.095 --> 00:04:10.775
they are even asked to. Some
fans on TikTok temporarily turn

68
00:04:10.775 --> 00:04:14.135
over their accounts to On
Tyranny, reading one chapter

69
00:04:14.135 --> 00:04:17.390
aloud per video until they've
narrated the whole thing.

70
00:04:18.910 --> 00:04:22.670
Speaking of TikTok, an alleged
so-called framework deal has

71
00:04:22.670 --> 00:04:25.790
been penned regarding the sale
of the Chinese owned platform to

72
00:04:25.790 --> 00:04:29.470
domestic concerns. The US
government and ByteDance have

73
00:04:29.470 --> 00:04:32.315
brokered a forthcoming deal
whereby Oracle, Silver Lake

74
00:04:32.315 --> 00:04:35.435
Technology Management, and
Andreessen Horowitz would

75
00:04:35.435 --> 00:04:39.595
oversee TikTok's US operations.
This group would have an 80%

76
00:04:39.595 --> 00:04:42.715
ownership share, and a member of
the board would be appointed by

77
00:04:42.715 --> 00:04:45.630
the US government. President
Trump has also suggested Fox

78
00:04:45.630 --> 00:04:49.550
News baron Rupert Murdoch will
likely be involved somehow. The

79
00:04:49.550 --> 00:04:53.310
US based owners would lease
TikTok's infamous algorithm,

80
00:04:53.390 --> 00:04:56.845
which Oracle would oversee and,
quote, retrain.

81
00:04:57.085 --> 00:05:01.165
Larry Ellison, CTO and founder
of Oracle, has also recently

82
00:05:01.165 --> 00:05:04.285
financed a number of large scale
media mergers with his son,

83
00:05:04.285 --> 00:05:08.205
David Ellison. Their Paramount
Skydance controls CBS, Paramount

84
00:05:08.205 --> 00:05:11.480
Pictures, and the streamer
Paramount Plus. The Ellisons are

85
00:05:11.480 --> 00:05:14.760
also allegedly eyeing a takeover
of Warner Brothers Discovery,

86
00:05:14.840 --> 00:05:28.425
which owns, among other things,
CNN. And finally, get ready for

87
00:05:28.425 --> 00:05:29.705
a really good sentence. You
ready?

88
00:05:29.705 --> 00:05:34.425
Ready for this good sentence?
Here we go. LimeWire, relaunched

89
00:05:34.425 --> 00:05:37.865
as an NFT marketplace, has
purchased the rights to the

90
00:05:37.865 --> 00:05:43.510
infamous Fyre Festival brand.
The New York Times reports that

91
00:05:43.510 --> 00:05:47.670
the purchase was made for
245,000 US dollars in an eBay

92
00:05:47.670 --> 00:05:50.950
auction. It is unclear what
LimeWire Fyre will become.

93
00:05:51.185 --> 00:05:54.465
The music downloader turned NFT
peddler is apparently aiming for

94
00:05:54.465 --> 00:05:57.105
something that, quote, expands
beyond the digital realm and

95
00:05:57.105 --> 00:06:00.625
taps into real world
experiences, community, and

96
00:06:00.625 --> 00:06:04.385
surprise, a thing which no doubt
aligns well with the Fyre

97
00:06:04.385 --> 00:06:11.070
Festival brand. Ew. In show news
this week, if you ordered a

98
00:06:11.070 --> 00:06:14.030
t-shirt, they are being printed
next week. Once the print is

99
00:06:14.030 --> 00:06:16.510
done, they will head to
Never Post HQ where they will be

100
00:06:16.510 --> 00:06:22.155
packed and shipped one by one by
hand with love. We will also

101
00:06:22.155 --> 00:06:25.995
have very few stock designs
available at the end of that

102
00:06:25.995 --> 00:06:26.635
process.

103
00:06:26.635 --> 00:06:29.595
I'm gonna let you know in the
show news portion of future

104
00:06:29.595 --> 00:06:32.715
episodes when and where you can
snag those if you missed out.

105
00:06:32.715 --> 00:06:35.115
But when I say very few, I
really mean it. We're gonna

106
00:06:35.115 --> 00:06:40.630
have, like, fewer than 10 stock
shirts. And finally, holy cow,

107
00:06:40.950 --> 00:06:45.910
we are a Signal Podcast Awards
finalist in the Technology

108
00:06:45.910 --> 00:06:47.270
category. That is fun.

109
00:06:47.270 --> 00:06:51.155
Heck yeah. If you could please
go vote for us, we would love

110
00:06:51.155 --> 00:06:53.795
that. We'll put a link in the
show notes. We are up against

111
00:06:53.795 --> 00:06:57.315
some really rad folks, including
Close All Tabs, who you may

112
00:06:57.315 --> 00:07:00.995
remember from our hentai segment,
and Kill Switch, of whom we're

113
00:07:00.995 --> 00:07:03.630
just generally fans. But please
go vote for us.

114
00:07:03.630 --> 00:07:06.910
We will love you forever. Signal
Awards, Technology category,

115
00:07:06.910 --> 00:07:09.790
there's a link in the show
notes. Okay. That's the news I

116
00:07:09.790 --> 00:07:12.830
have for you this week. In this
episode, Tori talks with Jason

117
00:07:12.830 --> 00:07:14.430
about AI in the classroom.

118
00:07:14.430 --> 00:07:19.855
But first, Bop Spotter is a
project by Riley Walz, and it's

119
00:07:19.855 --> 00:07:25.295
described this way. Somewhere in
the Mission District of San

120
00:07:25.295 --> 00:07:29.455
Francisco is a microphone
pointed down at the street

121
00:07:29.455 --> 00:07:48.125
below. It is using Shazam.
So in our interstitials

122
00:07:48.125 --> 00:07:54.285
this week, Hans took it upon
himself to recreate what he

123
00:07:54.285 --> 00:08:00.450
imagined to be the sonic
environment at the time of some

124
00:08:00.450 --> 00:08:04.850
spotting. So what you are about
to hear are not field

125
00:08:04.850 --> 00:08:09.970
recordings, but carefully
crafted audio collages.

126
00:09:09.235 --> 00:10:32.410
<v Jason Oberholtzer>Midnight,
12AM. 03:30AM. So I'm sitting

127
00:10:32.410 --> 00:10:36.090
here at my desk today, watching
the leaves slowly change color

128
00:10:36.090 --> 00:10:39.115
when an email comes in from
friend of the show, Tori

129
00:10:39.115 --> 00:10:42.635
Dominguez-Peak, who submitted to
us a year ago a piece you might

130
00:10:42.635 --> 00:10:46.875
remember, wherein AI chatbot
companies reached out to her

131
00:10:47.035 --> 00:10:50.955
with the proposition of turning
her deceased mother into an AI

132
00:10:50.955 --> 00:10:56.790
chatbot. Well, Tori is back with
another piece. I asked, what's

133
00:10:56.790 --> 00:10:57.430
it about?

134
00:10:57.430 --> 00:11:00.950
No one would let me know. Tori
wanted to tell me herself. So

135
00:11:00.950 --> 00:11:04.390
please, welcome back to
Never Post, Tori.

136
00:11:04.565 --> 00:11:06.485
<v Tori Dominguez-Peak>Hey, Jason.
Thanks so much for bringing me

137
00:11:06.485 --> 00:11:06.885
on.

138
00:11:06.885 --> 00:11:09.205
<v Jason Oberholtzer>I'm excited
to learn what I'm about to

139
00:11:09.205 --> 00:11:09.685
learn.

140
00:11:10.165 --> 00:11:15.045
<v Tori Dominguez-Peak>So today, I
have a tale for you about AI on

141
00:11:15.045 --> 00:11:21.410
college campuses and Brazilian
Portuguese and solving crimes.

142
00:11:24.930 --> 00:11:29.010
<v Jason Oberholtzer>All three of
my biggest interests. Let's get

143
00:11:29.010 --> 00:11:29.410
started.

144
00:11:29.410 --> 00:11:32.215
<v Tori Dominguez-Peak>Story in
three acts, the Aspheric life.

145
00:11:33.735 --> 00:11:37.175
So Jason, like you said, fall is
in full swing.

146
00:11:37.175 --> 00:11:37.895
<v Jason Oberholtzer>Absolutely.

147
00:11:37.895 --> 00:11:39.735
<v Tori Dominguez-Peak>I have I
have purchased pumpkin spice

148
00:11:39.735 --> 00:11:43.495
lattes. The leaves are
crunching. Sure. Hans has been

149
00:11:43.495 --> 00:11:44.215
cooking beans.

150
00:11:44.580 --> 00:11:46.900
<v Jason Oberholtzer>Hans has been
cooking beans. I'm up to five or

151
00:11:46.900 --> 00:11:48.820
six layers every time I leave
the house.

152
00:11:48.820 --> 00:11:51.860
<v Tori Dominguez-Peak>And with
fall happening comes a new

153
00:11:51.860 --> 00:11:56.020
semester on college campuses
everywhere across the US. And so

154
00:11:56.020 --> 00:12:00.165
kind of with that comes the
renewed conversation that people

155
00:12:00.165 --> 00:12:05.045
have been having about AI in
education, and is it cheating,

156
00:12:05.285 --> 00:12:10.245
and like all of the things.
Sure. So earlier this year, the

157
00:12:10.245 --> 00:12:14.540
New Yorker ran this piece with
the title, Everyone Is Cheating

158
00:12:14.540 --> 00:12:18.700
Through College. And the whole
crux of it was just talking

159
00:12:18.700 --> 00:12:22.540
about like how commonplace it is
for students to use ChatGPT or

160
00:12:22.540 --> 00:12:26.220
to use it to like help with
assignments or even going as

161
00:12:26.220 --> 00:12:28.845
far as like, write my term
paper for me.

162
00:12:28.925 --> 00:12:29.325
<v Jason Oberholtzer>Mhmm.

163
00:12:29.325 --> 00:12:31.085
<v Tori Dominguez-Peak>Blurring
the line between getting it to

164
00:12:31.085 --> 00:12:35.965
help you and then like, what has
become plagiarism. And like, the

165
00:12:35.965 --> 00:12:39.165
most stunning part about that
article to me that I still think

166
00:12:39.165 --> 00:12:42.040
about is they did a very small
survey. It was like a thousand

167
00:12:42.040 --> 00:12:46.040
college students. But 90% of
them had said they had used

168
00:12:46.040 --> 00:12:49.400
ChatGPT to help with homework
assignments.

169
00:12:49.720 --> 00:12:51.160
<v Jason Oberholtzer>And you found
this surprising?

170
00:12:51.640 --> 00:12:54.085
<v Tori Dominguez-Peak>Yeah. I mean,
just the number was wild. I

171
00:12:54.085 --> 00:12:57.525
knew it would be over 50. When I
saw 90, I was like, oh, we're

172
00:12:57.525 --> 00:12:58.885
cooked. Okay.

173
00:13:00.165 --> 00:13:04.165
But I just couldn't stop
thinking about like, what

174
00:13:04.165 --> 00:13:08.670
happens when we are letting a
piece of technology kind of do

175
00:13:08.670 --> 00:13:11.390
the thinking for us or do the
talking for us

176
00:13:11.710 --> 00:13:12.190
<v Jason Oberholtzer>Sure.

177
00:13:12.190 --> 00:13:16.590
<v Tori Dominguez-Peak>On that
scale. And so I decided to talk

178
00:13:16.590 --> 00:13:17.390
to someone.

179
00:13:17.550 --> 00:13:20.430
<v Megan Fritts>When I'm working
on a new paper, I'll be, you

180
00:13:20.430 --> 00:13:23.605
know, typing up some section and
realize I don't know how to

181
00:13:23.605 --> 00:13:27.285
phrase it. And that tells me,
oh, okay, I need to go figure

182
00:13:27.285 --> 00:13:30.005
out what I actually think here.
Because if I can't write about

183
00:13:30.005 --> 00:13:33.205
it, that indicates a lack of
understanding there. So I think

184
00:13:33.205 --> 00:13:36.930
what we're missing when we stop
writing ourselves is the ability

185
00:13:36.930 --> 00:13:39.330
to check ourselves for
misunderstandings.

186
00:13:39.650 --> 00:13:41.810
<v Tori Dominguez-Peak>So that's a
professor I interviewed. She's

187
00:13:41.810 --> 00:13:45.250
Professor Megan Fritts. She
teaches philosophy at the University

188
00:13:45.250 --> 00:13:47.490
of Arkansas at Little Rock.

189
00:13:47.730 --> 00:13:48.930
<v Jason Oberholtzer>I find it
really interesting here that

190
00:13:48.930 --> 00:13:52.185
she's using misunderstandings as
a framing. As if like we are

191
00:13:52.185 --> 00:13:56.505
interrogating our own brain when
we set down to write. Yeah. That

192
00:13:56.505 --> 00:14:00.345
feels pretty right to me. Tori,
can I ask you a question

193
00:14:00.345 --> 00:14:00.985
quickly?

194
00:14:01.145 --> 00:14:01.865
<v Tori Dominguez-Peak>Sure.

195
00:14:02.425 --> 00:14:03.865
<v Jason Oberholtzer>Did you cheat
in college?

196
00:14:05.070 --> 00:14:06.910
<v Tori Dominguez-Peak>No. I
cheated in high school though.

197
00:14:06.910 --> 00:14:09.310
<v Jason Oberholtzer>Did you cheat
in ways that you think

198
00:14:09.470 --> 00:14:13.070
fundamentally changed your
understanding or inhibited your

199
00:14:13.070 --> 00:14:15.630
understanding of what you were
doing? Or did they just help you

200
00:14:15.630 --> 00:14:16.430
get a better grade?

201
00:14:19.015 --> 00:14:20.855
<v Tori Dominguez-Peak>Okay. Let
me just lay out the one scenario

202
00:14:20.855 --> 00:14:23.335
that I cheated, and you could
help me here.

203
00:14:23.335 --> 00:14:23.735
<v Jason Oberholtzer>Mhmm.

204
00:14:23.735 --> 00:14:27.095
<v Tori Dominguez-Peak>So I was
failing chemistry. And so one

205
00:14:27.095 --> 00:14:29.975
thing I noticed was that kids
who took a long time to take the

206
00:14:29.975 --> 00:14:34.410
test, it was like third period.
And then you had fourth period

207
00:14:34.410 --> 00:14:37.450
and then there was lunch. And so
I was like, oh, if I just take

208
00:14:37.450 --> 00:14:41.850
forever on this test, I can
finish it later and study for it

209
00:14:41.850 --> 00:14:43.770
during fourth period.

210
00:14:43.770 --> 00:14:44.250
<v Jason Oberholtzer>Yeah.

211
00:14:44.250 --> 00:14:46.330
<v Tori Dominguez-Peak>And so I
was just like, oh, it's just

212
00:14:46.330 --> 00:14:49.585
taking me forever. I have a
headache. Like, I guess I have

213
00:14:49.585 --> 00:14:51.905
to come back for lunch. And
like, come back during lunch and

214
00:14:51.905 --> 00:14:55.585
finish up this test. Bell rings,
I go to fourth period.

215
00:14:56.065 --> 00:14:59.825
It's like my study hall period.
I'm like going

216
00:14:59.825 --> 00:15:03.905
over the stoichiometry formulas.
I'm hitting the books. And then

217
00:15:04.290 --> 00:15:07.010
Then lunch happens. I go back to
the chem room.

218
00:15:07.250 --> 00:15:10.530
I actually reanswered some stuff
because I was like, I have it in

219
00:15:10.530 --> 00:15:11.570
my brain fresh now.

220
00:15:11.570 --> 00:15:12.210
<v Jason Oberholtzer>Beautiful.

221
00:15:12.210 --> 00:15:14.610
<v Tori Dominguez-Peak>And I took
it. Is that cheating? It kind of

222
00:15:14.610 --> 00:15:15.090
is.

223
00:15:15.090 --> 00:15:16.690
<v Jason Oberholtzer>I well, yeah.
I mean, by the letter of the

224
00:15:16.690 --> 00:15:19.625
law, but it's gamesmanship is
what I think it is. Like, you

225
00:15:19.625 --> 00:15:20.105
still

226
00:15:20.265 --> 00:15:20.985
<v Tori Dominguez-Peak>the player.

227
00:15:20.985 --> 00:15:23.065
<v Jason Oberholtzer>Like Yeah.
You still walked in there with

228
00:15:23.065 --> 00:15:26.505
the requisite knowledge or the
understanding of how to find the

229
00:15:26.505 --> 00:15:29.465
knowledge, and you applied your
brain to the problems at hand

230
00:15:29.465 --> 00:15:33.260
and Yeah. Got a better grade on
them. Under the framework that

231
00:15:33.260 --> 00:15:37.580
Professor Fritts is setting out
here, that seems to be a

232
00:15:37.580 --> 00:15:41.740
different kind of malfeasance in
the classroom. And one that I

233
00:15:41.740 --> 00:15:43.260
perhaps look more fondly on.

234
00:15:43.260 --> 00:15:44.300
I cheated constantly.

235
00:15:44.535 --> 00:15:44.935
<v Tori Dominguez-Peak>Okay.

236
00:15:44.935 --> 00:15:47.575
<v Jason Oberholtzer>Either to buy
more time because I had not

237
00:15:47.575 --> 00:15:52.055
prepared myself sufficiently on
time, or route myself around

238
00:15:52.055 --> 00:15:55.015
rote memorization which I
considered to be an impediment

239
00:15:55.015 --> 00:15:57.815
to learning and not a benchmark
by which you measured learning.

240
00:15:58.340 --> 00:16:01.300
And honestly, resented having to
regurgitate things that one

241
00:16:01.300 --> 00:16:04.180
could find in a book onto a page
later. So I count neither of

242
00:16:04.180 --> 00:16:07.140
those things as cheating. But
like the thinking that I had to

243
00:16:07.140 --> 00:16:09.780
do with that information still
happened in my head and hit the

244
00:16:09.780 --> 00:16:09.940
page.

245
00:16:10.975 --> 00:16:13.135
<v Tori Dominguez-Peak>That's the
thing. Like, I was still

246
00:16:13.135 --> 00:16:13.935
studying.

247
00:16:14.095 --> 00:16:14.335
<v Jason Oberholtzer>Yeah.

248
00:16:14.335 --> 00:16:16.575
<v Tori Dominguez-Peak>I just
convinced my chem teacher that I

249
00:16:16.575 --> 00:16:20.975
had a headache when I didn't.
Perfect. Right? Yeah. But I was

250
00:16:20.975 --> 00:16:23.855
not plugging formulas into
ChatGPT and being like, what

251
00:16:23.855 --> 00:16:25.455
are the answers to these
questions?

252
00:16:26.170 --> 00:16:29.290
Which I think is kind of
different.

253
00:16:29.450 --> 00:16:30.010
<v Jason Oberholtzer>Sure.

254
00:16:30.090 --> 00:16:31.450
<v Tori Dominguez-Peak>And
Professor Fritts wrote this

255
00:16:31.450 --> 00:16:35.130
article for the Chronicle of
Higher Education, and I will

256
00:16:35.130 --> 00:16:38.010
read you out loud an excerpt
because I think it says

257
00:16:38.010 --> 00:16:41.185
something like really poignant.
It says, we're not simply

258
00:16:41.185 --> 00:16:44.625
frustrated by just trying to
police AI use or the labor of

259
00:16:44.625 --> 00:16:48.065
having to write up students for
academic dishonesty or the way

260
00:16:48.065 --> 00:16:51.345
that reading student work has
become a rather nihilistic task.

261
00:16:51.950 --> 00:16:55.630
Our frustration is not merely
that we don't care what AI has

262
00:16:55.630 --> 00:16:59.950
to say and therefore get bored
grading papers. It is that we

263
00:16:59.950 --> 00:17:03.710
actively miss reading the
thoughts of our human students.

264
00:17:05.105 --> 00:17:10.545
<v Jason Oberholtzer>That is so
dispiriting. Wow. Famously easy

265
00:17:10.545 --> 00:17:11.665
job gets easier.

266
00:17:11.665 --> 00:17:15.185
<v Tori Dominguez-Peak>It's kind
of a bummer. And she is like

267
00:17:15.185 --> 00:17:18.225
hitting something here. Like,
when you write down something on

268
00:17:18.225 --> 00:17:21.970
a page, you are transmuting your
thoughts onto a page. And when

269
00:17:21.970 --> 00:17:24.930
you turn it in, your instructor
is reading your thoughts. Like,

270
00:17:24.930 --> 00:17:28.930
we know there is a relationship
between writing and thinking.

271
00:17:28.930 --> 00:17:31.010
<v Jason Oberholtzer>Right.
Exactly. And I like that she's

272
00:17:31.010 --> 00:17:33.405
extending it to like a
relationship between people on

273
00:17:33.405 --> 00:17:35.725
either side of that activity.
You will have a relationship

274
00:17:35.725 --> 00:17:37.725
with your thoughts to the
writing and the people reading

275
00:17:37.725 --> 00:17:40.045
your writing have a relationship
to those thoughts and therefore

276
00:17:40.045 --> 00:17:43.885
you. And I think that is
probably one of the great joys

277
00:17:43.885 --> 00:17:46.445
of teaching is to be in
relationship with those people

278
00:17:46.445 --> 00:17:47.245
via their thoughts.

279
00:17:48.210 --> 00:17:51.170
<v Tori Dominguez-Peak>Yeah. And I
I kind of got to this sort of

280
00:17:51.170 --> 00:17:54.450
thing like writing and thinking
and like how they're related.

281
00:17:54.530 --> 00:17:57.970
And I asked her, hey, is writing
basically the same thing as

282
00:17:57.970 --> 00:18:02.035
thinking or are they kind of
intertwined in some way? And she

283
00:18:02.035 --> 00:18:05.635
was like, yeah, they definitely
are related. And that we write

284
00:18:05.635 --> 00:18:08.915
it to document our thoughts, but
we also write to come up with

285
00:18:08.915 --> 00:18:09.555
thoughts.

286
00:18:09.555 --> 00:18:13.475
It's kind of this really unique
like both-and relationship.

287
00:18:13.555 --> 00:18:15.670
<v Jason Oberholtzer>Yeah. You
definitely you write to find

288
00:18:15.670 --> 00:18:18.310
when you have to stop writing
because you don't know what's

289
00:18:18.310 --> 00:18:20.870
there, and then you have to go
think about it.

290
00:18:21.110 --> 00:18:23.750
<v Tori Dominguez-Peak>Yeah. I
mean, whenever I write a script

291
00:18:23.750 --> 00:18:27.510
for Never Post, I will write like
a paragraph and then walk away,

292
00:18:27.510 --> 00:18:29.990
and then come back like two days
later. And then, you know, it's

293
00:18:29.990 --> 00:18:33.645
just it's a slow process. But
it's because I'm in relation

294
00:18:33.645 --> 00:18:38.685
with my brain and trying to
figure it out. And then I'm also

295
00:18:38.685 --> 00:18:42.365
doing something that is also
very thinking heavy, which is

296
00:18:42.365 --> 00:18:43.725
I'm learning a second language.

297
00:18:44.300 --> 00:18:49.020
I spoke more Spanish as a kid
growing up in a Latin American

298
00:18:49.020 --> 00:18:53.100
household. And then I lost it as
I became a teenager. And then

299
00:18:53.100 --> 00:18:56.620
I'm trying to get back into it
as an adult. Mhmm. And so like,

300
00:18:56.935 --> 00:19:00.215
when I speak Spanish in my adult
learner's Spanish class, I'm

301
00:19:00.215 --> 00:19:04.375
like having a thought in English
and then translating it in my

302
00:19:04.375 --> 00:19:07.975
head and then saying it out loud
to them in Spanish, right, to my

303
00:19:07.975 --> 00:19:08.535
instructor.

304
00:19:08.535 --> 00:19:11.580
Sure. And then she says
something back to me in Spanish,

305
00:19:11.580 --> 00:19:14.140
and then I'm trying to translate
it into English in my head. And

306
00:19:14.140 --> 00:19:18.780
it's just this very like
mechanical relationship, and

307
00:19:18.780 --> 00:19:21.980
it's not easy. And I feel a
little bit like a baby learning

308
00:19:21.980 --> 00:19:23.820
to speak for the first time.

309
00:19:23.820 --> 00:19:26.395
<v Jason Oberholtzer>Yeah. Okay.
So if I'm if I'm hearing this

310
00:19:26.395 --> 00:19:30.395
right then, it's like you're
sort of seeing the gap

311
00:19:30.395 --> 00:19:33.035
between the thought that you're
having and your ability to

312
00:19:33.035 --> 00:19:38.555
articulate it in this second
now. I guess, re-second, third,

313
00:19:38.555 --> 00:19:42.050
second-again, language.
Probably especially because at

314
00:19:42.050 --> 00:19:43.730
one point, it was not there.

315
00:19:43.730 --> 00:19:46.450
And you're feeling this, like,
this break in the chain between

316
00:19:46.450 --> 00:19:49.410
you having a thought and being
able to articulate that. Yeah.

317
00:19:49.410 --> 00:19:53.665
And that feels sort of similar
to what you think is happening

318
00:19:53.665 --> 00:19:58.065
with the insertion of these AI
tools into the way people are

319
00:19:58.065 --> 00:19:59.265
writing these days.

320
00:19:59.425 --> 00:20:01.105
<v Tori Dominguez-Peak>Yeah. And
Professor Fritts kind of brought

321
00:20:01.105 --> 00:20:04.865
this up that AI is kind of this
middleman between thought and

322
00:20:04.865 --> 00:20:06.305
language that's never been there
before.

323
00:20:06.760 --> 00:20:08.920
<v Megan Fritts>The difference
between being a native speaker

324
00:20:08.920 --> 00:20:11.560
of a language and being someone
who's learned a language, I

325
00:20:11.560 --> 00:20:15.720
think is the perfect example
of what we are risking, when we

326
00:20:15.720 --> 00:20:21.465
use generative AI for our
writing and speaking, that we

327
00:20:21.465 --> 00:20:24.905
risk going from this kind of
native speaker status to a

328
00:20:24.905 --> 00:20:27.465
situation where, if we
want to have these skills at

329
00:20:27.465 --> 00:20:30.265
all, we have to reteach it to
ourselves in a really

330
00:20:30.265 --> 00:20:31.225
artificial way.

331
00:20:31.465 --> 00:20:34.690
<v Jason Oberholtzer>So is the
concern there almost like what

332
00:20:34.690 --> 00:20:37.570
happens in its absence? Or like
it's like what happens if you're

333
00:20:37.570 --> 00:20:41.170
without your Spanish-English
dictionary, as it were. Yeah. That

334
00:20:41.170 --> 00:20:43.330
you actually don't have
control over the language.

335
00:20:43.730 --> 00:20:47.970
<v Tori Dominguez-Peak>Yeah. It
feels like if we outsource, for

336
00:20:47.970 --> 00:20:52.095
example, you didn't do the
reading for your college class,

337
00:20:52.095 --> 00:20:53.855
but you have to write a paper
about it and you're just like,

338
00:20:53.935 --> 00:20:57.935
ChatGPT, write this paper about this
thing. You didn't engage with

339
00:20:57.935 --> 00:21:01.135
the text. Yeah. You didn't
transmit the text into writing.

340
00:21:01.375 --> 00:21:03.855
And so like, you probably won't
even remember what that class

341
00:21:03.855 --> 00:21:07.310
was about. Yeah. You're losing
some type of critical like brain

342
00:21:07.310 --> 00:21:11.790
step that helps you metabolize
information. Does that make

343
00:21:11.790 --> 00:21:14.510
sense? Like, I think writing
helps you metabolize

344
00:21:14.510 --> 00:21:15.390
information.

345
00:21:15.550 --> 00:21:20.105
And like, if I could just be
candid, I think writing is kind

346
00:21:20.105 --> 00:21:21.705
of mentally painful for me.

347
00:21:21.705 --> 00:21:23.305
<v Jason Oberholtzer>Oh, yeah.
That's the whole thing about

348
00:21:23.305 --> 00:21:23.705
writing.

349
00:21:23.705 --> 00:21:24.185
<v Tori Dominguez-Peak>Like

350
00:21:24.185 --> 00:21:25.305
<v Jason Oberholtzer>It hurts
and it's bad.

351
00:21:25.305 --> 00:21:27.305
Tori Dominguez-Peak: It hurts
and it's bad, and that's the

352
00:21:27.305 --> 00:21:30.425
whole point. And I will write
something and I'll walk away and

353
00:21:30.425 --> 00:21:33.490
I will come back three
days later, and I'm just like,

354
00:21:33.490 --> 00:21:35.970
oh, this psychologically hurts.
But then you get into a groove

355
00:21:35.970 --> 00:21:39.170
and then it feels good and
you've kind of metabolized your

356
00:21:39.170 --> 00:21:42.450
thoughts and it comes out and it
feels great. And there's just

357
00:21:42.450 --> 00:21:45.970
such an emotional experience in
that. And so when you're just

358
00:21:45.970 --> 00:21:51.525
like, hey ChatGPT, write this
podcast episode, summarize the

359
00:21:51.525 --> 00:21:53.925
notes from this interview I have
with the source, or write the

360
00:21:53.925 --> 00:21:55.125
interview questions.

361
00:21:55.285 --> 00:21:55.605
<v Jason Oberholtzer>Yeah.

362
00:21:55.605 --> 00:21:58.965
<v Tori Dominguez-Peak>You're kind
of losing some threads.

363
00:21:59.525 --> 00:22:01.605
<v Jason Oberholtzer>Yeah. You
know, not to be the person who's

364
00:22:01.605 --> 00:22:04.980
constantly defending cheating
here. But when you brought up

365
00:22:05.140 --> 00:22:08.740
writing the report in the book
you have not read, that is

366
00:22:08.740 --> 00:22:11.380
something I believe a lot of us
have. It's a classic move. You

367
00:22:11.380 --> 00:22:14.420
have to try it at some point.
And when you do that, you

368
00:22:14.420 --> 00:22:17.845
marshal enough information about
the book, you skim it, you look

369
00:22:17.845 --> 00:22:20.325
up some notes, you try to find
as much as you can to walk in

370
00:22:20.325 --> 00:22:20.885
there.

371
00:22:21.125 --> 00:22:26.325
But the thing you're doing when
you walk in there is like using

372
00:22:26.325 --> 00:22:30.005
your brain. It is learning. It
is performing. It is something

373
00:22:30.005 --> 00:22:33.670
that requires you to undergo a
process that will help you be a

374
00:22:33.670 --> 00:22:37.190
better thinker and communicator
in the future because you are

375
00:22:37.190 --> 00:22:42.230
actually doing a task. And to
me, what feels scary about this

376
00:22:42.310 --> 00:22:45.910
is that it is removing the
mental load of doing the task.

377
00:22:46.325 --> 00:22:49.685
Mhmm. Not so much cheating like
the information, which is like

378
00:22:49.685 --> 00:22:52.565
the veneer around which we all
do the process of learning, but

379
00:22:52.565 --> 00:22:54.885
it's removing the actual mental
task, which is the point of

380
00:22:54.885 --> 00:22:58.485
sitting down and being a part of
a university or a class or

381
00:22:58.485 --> 00:23:02.500
whatever the case may be. So I
feel like this has to feel

382
00:23:02.500 --> 00:23:04.900
different for teachers. Like,
they've walked into campuses

383
00:23:04.900 --> 00:23:07.460
every fall for millennia and
been like, alright, everyone's

384
00:23:07.460 --> 00:23:09.940
cheating. How do I make sure
that I know that they are smart

385
00:23:09.940 --> 00:23:12.900
enough to continue down the road
after they continue cheating?

386
00:23:12.900 --> 00:23:15.620
Like, I I that's probably
unlikely that people believe

387
00:23:15.620 --> 00:23:19.035
they have a complete fail proof
method to stop all cheating

388
00:23:19.035 --> 00:23:21.595
forever. Do you think that
professor Fritz or other

389
00:23:21.595 --> 00:23:25.515
professors are feeling like this
is a different kind of stop

390
00:23:25.515 --> 00:23:28.155
cheating move they need to make?

391
00:23:28.635 --> 00:23:34.070
<v Tori Dominguez-Peak>Yeah. So
professor Fritz has kind of gone

392
00:23:34.630 --> 00:23:40.070
nuclear. Okay. She instituted a
policy in her classroom that is

393
00:23:40.070 --> 00:23:43.750
just like, I am banning all AI
from my classroom. Okay.

394
00:23:43.750 --> 00:23:47.645
It includes ChatGPT. It even
includes like Grammarly, which I

395
00:23:47.645 --> 00:23:50.925
use Grammarly to like make sure
that my emails aren't misspelled

396
00:23:50.925 --> 00:23:51.485
or whatever.

397
00:23:51.485 --> 00:23:52.045
<v Mike Rugnetta>Oh, interesting.
Not even

398
00:23:52.045 --> 00:23:53.645
<v Tori Dominguez-Peak>too many
exclamation points. She's like,

399
00:23:53.645 --> 00:23:57.325
nope, not even that. Because
Grammarly can suggest rewrites.

400
00:23:57.325 --> 00:23:57.645
<v Jason Oberholtzer>Yeah.

401
00:23:57.645 --> 00:23:59.885
<v Tori Dominguez-Peak>And that in
itself is kind of generative AI.

402
00:23:59.965 --> 00:24:00.125
<v Jason Oberholtzer>Yeah.

403
00:24:00.860 --> 00:24:03.100
<v Tori Dominguez-Peak>And so I
asked her like, okay, so how are

404
00:24:03.100 --> 00:24:04.060
you enforcing this?

405
00:24:04.060 --> 00:24:04.460
<v Jason Oberholtzer>Sure.

406
00:24:04.460 --> 00:24:06.620
<v Tori Dominguez-Peak>It just
seems like a lot of work to

407
00:24:06.620 --> 00:24:11.100
enforce this. And she has a very
interesting way of going about

408
00:24:11.100 --> 00:24:16.205
this. So at some point earlier
in the semester, she has them

409
00:24:16.205 --> 00:24:19.885
write these short essay type
assignments in class. They are

410
00:24:19.885 --> 00:24:24.205
handwritten, and they turn it
in to her right there, hard copy.

411
00:24:24.445 --> 00:24:27.245
And so it's zero chance of AI
use.

412
00:24:27.245 --> 00:24:30.125
She had you write with a pen and
paper. Yeah. And she keeps these

413
00:24:30.125 --> 00:24:33.740
essays as kind of evidence of
like, this is how these people

414
00:24:33.740 --> 00:24:37.500
write. Oh. This is what your
voice, your narrative voice is

415
00:24:37.500 --> 00:24:38.060
like.

416
00:24:38.780 --> 00:24:41.100
And so then later on in the
semester, if you turn in something

417
00:24:41.100 --> 00:24:45.795
electronically and it's got AI
written stuff, she's gonna be

418
00:24:45.795 --> 00:24:47.395
like, this doesn't sound like
you.

419
00:24:47.395 --> 00:24:51.075
<v Jason Oberholtzer>Okay. Does
she do the comparison process

420
00:24:51.075 --> 00:24:54.435
herself or does she let AI do
the comparisons?

421
00:24:54.755 --> 00:24:57.120
<v Tori Dominguez-Peak>So she
kinda does both. Interesting.

422
00:24:57.200 --> 00:25:00.560
She does use eight different AI
detection programs that she runs

423
00:25:00.560 --> 00:25:02.320
it through, which is Wow. It's a
lot.

424
00:25:02.320 --> 00:25:03.680
<v Jason Oberholtzer>Does the
school pay for these?

425
00:25:04.000 --> 00:25:06.000
<v Tori Dominguez-Peak>I don't
know. That's a great question.

426
00:25:06.800 --> 00:25:08.000
Don't seem cheap, do they?

427
00:25:08.000 --> 00:25:09.360
<v Jason Oberholtzer>No. I would
imagine not.

428
00:25:11.285 --> 00:25:12.805
<v Tori Dominguez-Peak>And so
she runs it through all of

429
00:25:12.805 --> 00:25:18.005
those. She also just looks at it
herself and is like, yeah, I

430
00:25:18.005 --> 00:25:18.725
could tell.

431
00:25:18.885 --> 00:25:20.805
<v Jason Oberholtzer>Yeah. Of
course. Right? Like, teachers

432
00:25:20.805 --> 00:25:22.565
have been seeing this, like,
forever.

433
00:25:22.805 --> 00:25:25.525
<v Tori Dominguez-Peak>But I did
ask, like, can you just tell off

434
00:25:25.525 --> 00:25:28.820
the bat if a student's writing
is AI generated just by looking

435
00:25:28.820 --> 00:25:34.340
at their paper? And she said,
yes. And that there's usually a

436
00:25:34.340 --> 00:25:35.300
couple of clues.

437
00:25:35.300 --> 00:25:37.220
<v Jason Oberholtzer>Okay. Is this
where you're coming for my em

438
00:25:37.220 --> 00:25:41.285
dashes? Yeah. Alright. I'll
listen to it at least.

439
00:25:41.525 --> 00:25:44.565
<v Tori Dominguez-Peak>The first
clue is she calls them 50¢

440
00:25:44.565 --> 00:25:49.125
words. Weird, like formal words
that the typical 18 year old

441
00:25:49.125 --> 00:25:52.085
undergrad would just not be
using. Okay.

442
00:25:52.840 --> 00:25:56.680
<v Megan Fritts>An example of that
is like the word wont, w o n t,

443
00:25:56.680 --> 00:26:01.160
where so you might use it in a
sentence like, I am wont to take a

444
00:26:01.160 --> 00:26:03.640
walk in the morning. So it's
like talking about an

445
00:26:03.640 --> 00:26:09.545
inclination. That's a 50¢ word
that I I I would say most of my

446
00:26:09.545 --> 00:26:12.585
students probably aren't just
casually using in their

447
00:26:12.585 --> 00:26:13.785
reflection writings.

448
00:26:14.025 --> 00:26:16.105
<v Jason Oberholtzer>Now you're
coming for my vocabulary.

449
00:26:16.265 --> 00:26:17.705
First, my em dashes.

450
00:26:17.785 --> 00:26:19.385
<v Tori Dominguez-Peak>Are you
wont to use wont?

451
00:26:19.385 --> 00:26:22.970
<v Jason Oberholtzer>I mean, of
course, I am. But like, where in

452
00:26:22.970 --> 00:26:26.410
an academic setting would I ever
run across the need to express

453
00:26:26.410 --> 00:26:29.050
my feelings through the word
wont?

454
00:26:29.130 --> 00:26:31.370
<v Tori Dominguez-Peak>Yeah.
Exactly. Clue number two, and

455
00:26:31.370 --> 00:26:34.810
you were pretty right about
this, was em dashes. Yeah. It's

456
00:26:34.810 --> 00:26:38.765
not just any em dash because I I
I feel you, like, it sucks that

457
00:26:38.765 --> 00:26:41.325
em dashes have become like this
weird red flag and it's like, I

458
00:26:41.325 --> 00:26:42.285
like a good em dash.

459
00:26:42.285 --> 00:26:42.925
<v Jason Oberholtzer>Sure.

460
00:26:43.005 --> 00:26:46.285
<v Tori Dominguez-Peak>She said
that there's a type of em dash

461
00:26:46.285 --> 00:26:48.045
that is a red flag to her.

462
00:26:48.540 --> 00:26:50.700
<v Megan Fritts>An acquaintance of
mine, another philosophy

463
00:26:50.700 --> 00:26:54.460
professor, he calls those
epiphany dashes, where you go

464
00:26:54.460 --> 00:26:58.380
from an ordinary, you know,
thought like it's not just a

465
00:26:58.380 --> 00:27:02.380
walk em dash, it's a
brainstorming session. This is,

466
00:27:02.380 --> 00:27:03.500
you know, you're having an
epiphany.

467
00:27:04.055 --> 00:27:06.695
<v Jason Oberholtzer>Interesting.
Yeah. So there's like an emotive

468
00:27:06.695 --> 00:27:09.175
component to the em dash when
it's used this way.

469
00:27:09.255 --> 00:27:11.095
<v Tori Dominguez-Peak>In some
sense it sounds like a LinkedIn

470
00:27:11.095 --> 00:27:14.215
post. You know what I mean?
Like, has that type type of

471
00:27:14.215 --> 00:27:15.175
cadence for it.

472
00:27:15.175 --> 00:27:16.855
<v Jason Oberholtzer>Right. It's
like It's sort of a it's like

473
00:27:16.855 --> 00:27:21.200
copywriting usage not Yeah.
Yeah. Yeah. Oh, that's super

474
00:27:21.200 --> 00:27:21.760
interesting.

475
00:27:21.760 --> 00:27:23.680
<v Tori Dominguez-Peak>And then
obviously, the third main clue

476
00:27:23.680 --> 00:27:26.720
is like she runs it through one
of her software programs and

477
00:27:26.720 --> 00:27:30.880
it's like, bing, AI generated.
But yeah, I mean, it's kind of an

478
00:27:30.880 --> 00:27:31.600
intensive process.

479
00:27:32.175 --> 00:27:34.255
<v Jason Oberholtzer>Yeah. It
sounds exhausting. I guess for

480
00:27:34.255 --> 00:27:36.895
everyone, I suppose. Like, are
the students having a good time

481
00:27:36.895 --> 00:27:37.935
while this is happening?

482
00:27:38.415 --> 00:27:40.415
Tori Dominguez-Peak: They don't
seem to be big fans.

483
00:27:40.415 --> 00:27:40.975
<v Jason Oberholtzer>Sure.

484
00:27:41.055 --> 00:27:43.775
<v Tori Dominguez-Peak>As you can
imagine, a lot of them have

485
00:27:43.775 --> 00:27:47.500
reacted by saying, well, I don't
get why you've banned it because

486
00:27:47.500 --> 00:27:50.860
my other professors and other
classes don't care. So like, why

487
00:27:50.860 --> 00:27:51.740
should it matter?

488
00:27:51.740 --> 00:27:53.980
<v Jason Oberholtzer>Oh, jeez.
Alright. Well, at least they're

489
00:27:53.980 --> 00:27:55.900
still learning something and
that something is emotional

490
00:27:55.900 --> 00:27:56.780
manipulation.

491
00:27:57.260 --> 00:27:58.860
<v Tori Dominguez-Peak>Yeah. I
guess they gotta learn to read

492
00:27:58.860 --> 00:27:59.660
the syllabus.

493
00:27:59.980 --> 00:28:00.380
<v Jason Oberholtzer>Yeah.

494
00:28:00.380 --> 00:28:04.355
<v Tori Dominguez-Peak>But it does
kind of bring up something which

495
00:28:04.355 --> 00:28:09.315
is that, okay, some professors
don't care. Professor Fritz very

496
00:28:09.315 --> 00:28:12.595
much does. Yeah. And so college
students are kind of navigating

497
00:28:12.595 --> 00:28:15.235
this landscape where it's like
in the same semester, they might

498
00:28:15.235 --> 00:28:18.080
have someone who's a real
stickler about this stuff. And

499
00:28:18.080 --> 00:28:21.200
they may also have a different
teacher who doesn't care.

500
00:28:21.280 --> 00:28:26.160
And it seems like they have to
navigate all these individual AI

501
00:28:26.160 --> 00:28:27.120
policies.

502
00:28:27.440 --> 00:28:30.455
<v Jason Oberholtzer>Or just not
use AI? That's one way to

503
00:28:30.455 --> 00:28:31.735
navigate all of them.

504
00:28:31.815 --> 00:28:35.815
<v Tori Dominguez-Peak>Or just not
use it at all. But I asked Fritz

505
00:28:35.815 --> 00:28:38.535
about this like, oh, do you talk
with other professors about

506
00:28:38.535 --> 00:28:41.815
handling this? And she said that
she actually sits on a couple of

507
00:28:41.815 --> 00:28:43.895
AI related committees at her
university.

508
00:28:44.190 --> 00:28:48.990
<v Megan Fritts>I think instructor
uniformity and solidarity on

509
00:28:48.990 --> 00:28:52.990
this issue is pretty important
for our students. I thought it

510
00:28:52.990 --> 00:28:56.670
was a good idea, I was excited
to to to try to make this

511
00:28:56.670 --> 00:28:59.565
policy. But what ended up
happening is that people just

512
00:28:59.565 --> 00:29:06.925
had such different views on AI
use in higher education that it

513
00:29:06.925 --> 00:29:15.080
kind of just turned into debate
every every meeting, and we we

514
00:29:15.080 --> 00:29:18.120
have not yet made any kind of a
policy.

515
00:29:18.600 --> 00:29:20.920
<v Jason Oberholtzer>Okay. Well, I
suppose that's predictable. It's

516
00:29:20.920 --> 00:29:24.360
academia. It's meetings. It's
consensus.

517
00:29:24.360 --> 00:29:27.755
That is not necessarily easy to
do, but, like, you know, I'm not

518
00:29:27.755 --> 00:29:30.475
gonna tip my hand on where I
stand on this. I'm sure everyone

519
00:29:30.475 --> 00:29:33.515
is in deep mystery here. But
like, if it's working for

520
00:29:33.915 --> 00:29:37.995
professor Fritz, like, just let
her do her thing. It seems like

521
00:29:37.995 --> 00:29:38.475
it works.

522
00:29:38.475 --> 00:29:40.555
<v Tori Dominguez-Peak>Yeah. I
mean, she thinks that she feels

523
00:29:40.555 --> 00:29:45.400
that her system works well for
her. Yeah. But her way of doing

524
00:29:45.400 --> 00:29:49.800
things relies on knowing what
these students write like, and

525
00:29:49.800 --> 00:29:52.360
knowing what they don't write
like, and their writing as a

526
00:29:52.360 --> 00:29:56.645
sort of fingerprint. But what
happens when you can't really

527
00:29:56.645 --> 00:29:59.285
tell and you have to do some
detective work?

528
00:29:59.445 --> 00:30:00.005
<v Jason Oberholtzer>Mhmm.

529
00:30:00.165 --> 00:30:02.405
<v Tori Dominguez-Peak>So I found
a detective.

530
00:30:02.725 --> 00:30:04.325
<v Jason Oberholtzer>Alright,
listeners. This is the most

531
00:30:04.325 --> 00:30:09.180
podcast break we will ever do
after these messages, a

532
00:30:09.180 --> 00:30:09.740
detective.

533
00:30:26.435 --> 00:30:29.395
<v Rui Sousa-Silva>Even though we
learn the same languages from

534
00:30:29.395 --> 00:30:33.235
the same books and we learn we
can find the same words in the

535
00:30:33.235 --> 00:30:37.235
same dictionaries, the way each
one of us uses language is

536
00:30:37.235 --> 00:30:42.040
different. So we have a let's
call it a different style of

537
00:30:42.040 --> 00:30:43.160
using language.

538
00:30:45.320 --> 00:30:49.880
<v Tori Dominguez-Peak>So this is
doctor Rui Sousa-Silva. He's from

539
00:30:49.880 --> 00:30:54.555
Portugal. His whole training and
job is as a forensic linguist.

540
00:30:54.555 --> 00:30:58.155
Oh. So like, literally, his
field is all about confirming

541
00:30:58.315 --> 00:31:00.555
the identity of who wrote what.

542
00:31:00.635 --> 00:31:03.035
<v Jason Oberholtzer>Woah. That's
a rad job.

543
00:31:03.275 --> 00:31:04.955
<v Tori Dominguez-Peak>I know.
It's such a cool job. And I

544
00:31:04.955 --> 00:31:07.890
literally didn't even know this
job existed until I started

545
00:31:07.890 --> 00:31:10.850
writing this episode. And I was
like, this is amazing. Wow.

546
00:31:11.810 --> 00:31:17.170
And being able to identify the
nuances of what someone sounds

547
00:31:17.170 --> 00:31:20.530
like, that identity is called an
idiolect.

548
00:31:21.345 --> 00:31:27.105
<v Rui Sousa-Silva>So idiolect is
your own way of speaking or

549
00:31:27.105 --> 00:31:31.985
writing the language. So it's as
if your DNA was related to the

550
00:31:31.985 --> 00:31:32.945
way you use language.

551
00:31:33.420 --> 00:31:34.460
<v Jason Oberholtzer>I believe it.

552
00:31:34.540 --> 00:31:38.380
<v Tori Dominguez-Peak>So I mean,
like, legally, criminally,

553
00:31:38.860 --> 00:31:42.540
historically, a forensic
linguist is the person you call

554
00:31:42.540 --> 00:31:46.300
to match the fingerprints of
someone's writing. Right?

555
00:31:46.300 --> 00:31:46.620
<v Jason Oberholtzer>Woah.

556
00:31:47.155 --> 00:31:49.715
<v Tori Dominguez-Peak>And so at
the same time, he's also a

557
00:31:49.715 --> 00:31:53.795
lecturer. He also teaches, which
means that he also has to deal

558
00:31:53.795 --> 00:31:59.235
with the issue of students using
AI in his class. Rui also has

559
00:31:59.235 --> 00:32:03.600
students write in his class, sit
down and write. I know you're

560
00:32:03.600 --> 00:32:05.000
not using AI because I can look
at you writing. Right?

561
00:32:06.000 --> 00:32:10.320
But when they do that, he's
noticing something different

562
00:32:10.560 --> 00:32:12.240
than Professor Fritz.

563
00:32:12.480 --> 00:32:14.880
<v Rui Sousa-Silva>What we see
nowadays, people interact with

564
00:32:14.880 --> 00:32:20.005
generative AI so much that
people are starting to write

565
00:32:20.005 --> 00:32:25.445
like machines. With some of my
students, I know that they are

566
00:32:25.525 --> 00:32:29.525
sitting an exam and I know they
were the ones who wrote the text

567
00:32:29.800 --> 00:32:33.480
and still when I read the text
it sounds as if it was generated

568
00:32:33.480 --> 00:32:38.440
by a machine. And that's because
we accommodate with other people

569
00:32:38.520 --> 00:32:41.240
and we accommodate in the same
way with the machines we

570
00:32:41.240 --> 00:32:45.595
interact with. So we tend to
accommodate so much to the

571
00:32:45.595 --> 00:32:48.795
machine that we learn so much
from the machine that we start

572
00:32:48.795 --> 00:32:51.355
start writing like machines. So
this is a challenge at the

573
00:32:51.355 --> 00:32:51.915
moment.

574
00:32:52.155 --> 00:32:54.555
<v Jason Oberholtzer>So is he
saying that because we're

575
00:32:54.555 --> 00:32:58.070
ingesting so much writing that
has been made in this process

576
00:32:58.070 --> 00:33:02.950
that we are starting to
regurgitate Yeah. That Yeah.

577
00:33:02.950 --> 00:33:03.910
That checks out.

578
00:33:04.150 --> 00:33:06.630
<v Tori Dominguez-Peak>That's wild
though, isn't it? Yeah. Because

579
00:33:06.630 --> 00:33:09.590
then it makes me think about
professor Fritz's class and her

580
00:33:09.590 --> 00:33:13.175
methods, like, what if people
just start writing like

581
00:33:13.175 --> 00:33:14.295
ChatGPT?

582
00:33:14.295 --> 00:33:15.255
<v Jason Oberholtzer>Oh, boy.

583
00:33:15.255 --> 00:33:17.335
<v Tori Dominguez-Peak>Then at
some point, it's just gonna get

584
00:33:17.575 --> 00:33:18.695
harder to tell.

585
00:33:19.015 --> 00:33:21.575
<v Jason Oberholtzer>Alright.
You've made an unvirtuous cycle

586
00:33:21.575 --> 00:33:23.895
here. I see what has happened.

587
00:33:24.135 --> 00:33:26.780
<v Tori Dominguez-Peak>You see
what has happened. Yeah. And so

588
00:33:26.780 --> 00:33:29.900
because Rui is a forensic
linguist, like he is the person

589
00:33:29.900 --> 00:33:33.500
who can tell who wrote what, I
was like, can you please give me

590
00:33:33.500 --> 00:33:37.420
an example of, like, how machines

591
00:33:37.420 --> 00:33:41.405
influence the way people write.
And he said that when you speak

592
00:33:41.405 --> 00:33:45.805
to ChatGPT in Portuguese,
because he lives in Portugal, it

593
00:33:45.805 --> 00:33:51.645
will sometimes reply using a
Brazilian Portuguese dialect,

594
00:33:51.965 --> 00:33:53.645
and that has consequences.

595
00:33:53.805 --> 00:33:56.750
<v Rui Sousa-Silva>Yeah. One
example is the way when when

596
00:33:56.750 --> 00:33:59.790
you're writing in English, you
usually say, if you want to to

597
00:33:59.790 --> 00:34:03.470
list a set of points, you'll
say, firstly, such and such.

598
00:34:03.470 --> 00:34:09.925
Secondly, such and such. And in
Portuguese, usually you wouldn't

599
00:34:09.925 --> 00:34:14.405
use the literal translation of
the adverb. But people are now

600
00:34:14.405 --> 00:34:17.685
doing that and that's because
interestingly Brazilian

601
00:34:17.685 --> 00:34:22.940
Portuguese does that and because
when you look at language

602
00:34:22.940 --> 00:34:27.100
variants, I mean in Portugal
you've got about 10,000,000

603
00:34:27.260 --> 00:34:31.740
speakers, if you go to Brazil
there are 200,000,000, so for

604
00:34:31.740 --> 00:34:35.845
generative AI engines they feed
on languages.

605
00:34:35.845 --> 00:34:39.205
So they they are more likely to
feed on Brazilian Portuguese than

606
00:34:39.205 --> 00:34:40.805
<v Tori Dominguez-Peak>There's
just more Brazilian

607
00:34:40.805 --> 00:34:44.565
Portuguese language data out
there. And so when ChatGPT speaks

608
00:34:44.565 --> 00:34:46.645
Portuguese, it sounds Brazilian.

609
00:34:46.645 --> 00:34:48.165
<v Rui Sousa-Silva>It's usually
Brazilian Portuguese, even

610
00:34:48.165 --> 00:34:51.300
though nowadays you can ask to
write in European Portuguese but

611
00:34:51.300 --> 00:34:54.820
every now and then there is a
word in Brazilian Portuguese for

612
00:34:54.820 --> 00:34:59.140
example. So the fact that it was
based on Brazilian Portuguese

613
00:34:59.140 --> 00:35:02.340
and the fact that Brazilian
Portuguese uses that literal

614
00:35:02.340 --> 00:35:06.265
translation of firstly,
secondly, thirdly, now people

615
00:35:06.505 --> 00:35:08.505
are writing like that.

616
00:35:08.745 --> 00:35:10.345
<v Tori Dominguez-Peak>Oh, that's
so interesting.

617
00:35:10.665 --> 00:35:13.145
<v Rui Sousa-Silva>Even native
speakers of European Portuguese

618
00:35:13.145 --> 00:35:14.665
are writing like that at the
moment.

619
00:35:14.665 --> 00:35:17.145
<v Jason Oberholtzer>Wow. As I'm
hearing that, I'm just thinking

620
00:35:17.145 --> 00:35:20.550
that that is not necessarily the
canary in the coal mine, but

621
00:35:20.550 --> 00:35:23.270
there's probably some better
metaphor for it just being the

622
00:35:23.270 --> 00:35:26.950
visible object of something that
is also happening under the

623
00:35:26.950 --> 00:35:30.230
layer of language and thinking.
If we are regurgitating

624
00:35:30.230 --> 00:35:33.350
different cultural grammar
rules, we're probably also

625
00:35:33.350 --> 00:35:37.715
surfacing other imports that we
don't know are imported from

626
00:35:37.715 --> 00:35:38.595
different places.

627
00:35:39.075 --> 00:35:41.715
<v Tori Dominguez-Peak>Yeah. I
mean, Brazilian adverbs are

628
00:35:41.715 --> 00:35:45.235
pretty low stakes. I mean, I
think it's kind of amusing. But

629
00:35:45.235 --> 00:35:49.070
remember when I said that
forensic linguists identify the

630
00:35:49.070 --> 00:35:50.430
DNA of language?

631
00:35:50.430 --> 00:35:50.750
<v Jason Oberholtzer>Yeah.

632
00:35:50.750 --> 00:35:54.830
<v Tori Dominguez-Peak>And they do
that in criminal settings. Can I

633
00:35:54.830 --> 00:35:58.110
outline a scary scenario for
you, Jason?

634
00:35:59.390 --> 00:36:00.750
<v Jason Oberholtzer>I'm braced.
I'm ready.

635
00:36:00.830 --> 00:36:04.895
<v Tori Dominguez-Peak>Okay. So
doctor Sousa-Silva told me that

636
00:36:04.895 --> 00:36:07.535
one of the things that his
colleagues talk about all the

637
00:36:07.535 --> 00:36:12.335
time is the use of AI in
criminal activity. So like using

638
00:36:12.335 --> 00:36:16.300
generative AI to impersonate
someone's writing style to write

639
00:36:16.300 --> 00:36:18.140
something incriminating.

640
00:36:18.140 --> 00:36:18.540
<v Jason Oberholtzer>Sure.

641
00:36:18.540 --> 00:36:21.100
<v Tori Dominguez-Peak>So for
example, somebody does not like

642
00:36:21.100 --> 00:36:26.220
you. And so they write a
strongly worded threat to a

643
00:36:26.220 --> 00:36:30.825
politician, but they write it
Jason Oberholtzer style. And they

644
00:36:30.825 --> 00:36:33.465
like, maybe they make a sock
puppet social media account and

645
00:36:33.465 --> 00:36:36.905
they impersonate you, and they
post it on there. And you get a

646
00:36:36.905 --> 00:36:39.785
visit from the police, and
they're like, you wrote this.

647
00:36:39.785 --> 00:36:40.265
<v Jason Oberholtzer>Yeah.

648
00:36:40.265 --> 00:36:41.865
<v Tori Dominguez-Peak>How do you
prove that you didn't?

649
00:36:42.345 --> 00:36:45.850
<v Jason Oberholtzer>That's a good
question. I would probably at

650
00:36:45.850 --> 00:36:52.250
this point, try to point to like
the corpus of available writing

651
00:36:52.250 --> 00:36:54.650
that I would have on hand. Like,
I'm trying to really take a

652
00:36:54.650 --> 00:36:57.055
cynical view of this and just
assume that the the

653
00:36:57.055 --> 00:36:59.455
infrastructure is weighted
against me on this one, and

654
00:36:59.455 --> 00:37:02.895
there's now a letter out there
signed by me that says, hey,

655
00:37:02.895 --> 00:37:09.135
buddy, I'm gonna kill you. And
to disprove this, I don't think

656
00:37:09.135 --> 00:37:11.950
I would have a successful time
attacking the language on a word

657
00:37:11.950 --> 00:37:15.470
by word basis. I would probably
have to compile my own corpus of

658
00:37:15.470 --> 00:37:20.350
writing, and I'd probably have
to divulge repositories of data

659
00:37:20.350 --> 00:37:23.710
that I would otherwise want to
keep secret, like private

660
00:37:23.710 --> 00:37:28.225
messages and be like, I will
take all of my iMessages and

661
00:37:28.225 --> 00:37:29.265
put them in a model.

662
00:37:29.265 --> 00:37:32.145
And you can see the way that I
communicate and you can see like

663
00:37:32.145 --> 00:37:35.985
all the communication I have
had, all the cynicism I have had

664
00:37:35.985 --> 00:37:39.025
around politicians and the
government. And you tell me

665
00:37:39.250 --> 00:37:42.450
where in this trajectory is
there the leap to a murderer.

666
00:37:43.010 --> 00:37:47.090
And it is less about word choice
and more about state of mental

667
00:37:47.090 --> 00:37:50.450
well-being. Like, is this the
and then I'm going to write a

668
00:37:50.450 --> 00:37:52.930
politician and murder them
trajectory? And here's every

669
00:37:52.930 --> 00:37:55.405
piece of written correspondence
I have available to you.

670
00:37:55.405 --> 00:37:59.885
And just hope that I've got a
doctor like the good Portuguese

671
00:37:59.885 --> 00:38:02.685
doctor, doctor Sousa-Silva, on my
side who can help me make a

672
00:38:02.685 --> 00:38:05.245
better argument about that
material than whoever's on the

673
00:38:05.245 --> 00:38:05.885
other side.

674
00:38:06.365 --> 00:38:08.580
<v Tori Dominguez-Peak>Yeah. I
mean, definitely, you would like

675
00:38:08.580 --> 00:38:10.900
to call doctor Sousa-Silva.
Right?

676
00:38:10.900 --> 00:38:11.220
<v Jason Oberholtzer>Yeah.

677
00:38:11.220 --> 00:38:12.900
<v Tori Dominguez-Peak>They can
analyze the text. They're like,

678
00:38:12.900 --> 00:38:17.140
this doesn't quite reek of
Jason. There's something just a

679
00:38:17.140 --> 00:38:18.180
little bit off of this.

680
00:38:18.180 --> 00:38:19.540
<v Jason Oberholtzer>Speaking of
word choice, can we do better

681
00:38:19.540 --> 00:38:20.580
than reek of Jason?

682
00:38:20.580 --> 00:38:21.380
<v Tori Dominguez-Peak>Reek of
Jason.

683
00:38:23.135 --> 00:38:25.215
<v Jason Oberholtzer>For want of a
better term, I suppose you can

684
00:38:25.215 --> 00:38:26.015
keep reek.

685
00:38:26.095 --> 00:38:28.495
<v Tori Dominguez-Peak>So, yeah,
this is where forensic linguists

686
00:38:28.495 --> 00:38:32.015
come in and then you don't get
arrested, hopefully. Right?

687
00:38:32.415 --> 00:38:37.695
Okay. But as generative AI gets
more sophisticated, Rui thinks

688
00:38:37.695 --> 00:38:39.920
that his work is going to get
harder.

689
00:38:40.000 --> 00:38:42.960
<v Rui Sousa-Silva>The
developments in generative AI

690
00:38:42.960 --> 00:38:47.520
will make it more complicated
for forensic linguists to

691
00:38:47.520 --> 00:38:52.000
attribute texts, which in turn
will mean that forensic

692
00:38:52.000 --> 00:38:55.255
linguists will need to do more
research and to further their

693
00:38:55.255 --> 00:38:59.175
research and to have more fine
grained methods of attributing

694
00:38:59.175 --> 00:39:04.135
authorship. But there will
always be a distinction between

695
00:39:04.455 --> 00:39:08.840
the way humans produce text and
the way machines generate texts.

696
00:39:08.840 --> 00:39:13.480
So generative AI will
evolve, forensic linguistics

697
00:39:13.480 --> 00:39:19.400
will evolve, but eventually we
will always be able to pinpoint

698
00:39:19.480 --> 00:39:21.320
differences between the texts.

699
00:39:21.905 --> 00:39:25.105
<v Jason Oberholtzer>So very
similarly to the classroom here,

700
00:39:25.105 --> 00:39:27.825
this seems like it is just
creating piles of work for

701
00:39:27.825 --> 00:39:28.305
everybody.

702
00:39:28.305 --> 00:39:30.225
<v Tori Dominguez-Peak>Yeah. It
seems like things are just gonna

703
00:39:30.225 --> 00:39:34.225
get harder for everyone, which
is kind of a bummer. And I know,

704
00:39:34.225 --> 00:39:36.850
like, the threatening the
politician, like, that's a very

705
00:39:36.850 --> 00:39:40.130
dramatic example, obviously. But
like you said, with the

706
00:39:40.130 --> 00:39:45.010
classroom, the idea I keep
coming back to is cognitive

707
00:39:45.010 --> 00:39:50.275
offloading and like, not doing
all these mental processes and

708
00:39:50.275 --> 00:39:53.395
offloading it to AI, which then
I guess makes it harder

709
00:39:53.395 --> 00:39:58.835
forensically. Like, there's been
studies from Microsoft, from the

710
00:39:58.835 --> 00:40:04.355
SBS Swiss Business School, about
how people who use generative AI

711
00:40:04.480 --> 00:40:08.880
regularly tend to score lower on
markers of critical thinking.

712
00:40:08.880 --> 00:40:12.880
Like, there's actual data we
have now. Mhmm. Again, I'm

713
00:40:12.880 --> 00:40:16.320
painting kind of a scary picture
to you. It's like, okay, so our

714
00:40:16.320 --> 00:40:20.965
critical thinking may be getting
compromised by a technology that

715
00:40:20.965 --> 00:40:24.165
is also getting more
sophisticated at pretending to

716
00:40:24.165 --> 00:40:29.605
be us. And we're also starting
to become influenced by the way

717
00:40:29.605 --> 00:40:30.485
it writes.

718
00:40:30.565 --> 00:40:33.780
Mhmm. Like, that's just such a
weird trifecta.

719
00:40:34.100 --> 00:40:35.860
<v Jason Oberholtzer>Right. But
doesn't the cycle also work in

720
00:40:35.860 --> 00:40:39.140
the other direction? Like, we
are the corpus of information

721
00:40:39.140 --> 00:40:43.380
that the generative models need
to continue their work. And as

722
00:40:43.380 --> 00:40:46.515
we lose cognitive function
because of offloading, the

723
00:40:46.515 --> 00:40:50.755
material that we are able to
feed depreciates in value as

724
00:40:50.755 --> 00:40:55.075
well, which one imagines leads
to worse outputs from machines,

725
00:40:55.075 --> 00:40:58.115
which we are synthesizing and
further inhibits our ability to

726
00:40:58.115 --> 00:41:01.730
think and provide a reasonable
corpus of information updated to

727
00:41:01.730 --> 00:41:04.370
the moment from which Yeah. The
models can select.

728
00:41:04.370 --> 00:41:06.290
<v Tori Dominguez-Peak>It's kind
of like this feedback loop.

729
00:41:06.450 --> 00:41:11.890
Yeah. Us feeding it and then it
influencing us, and then all of

730
00:41:11.890 --> 00:41:14.210
a sudden, we're just all
speaking Brazilian Portuguese,

731
00:41:14.725 --> 00:41:15.365
right?

732
00:41:15.525 --> 00:41:16.085
<v Mike Rugnetta>Yeah.

733
00:41:16.405 --> 00:41:18.085
<v Tori Dominguez-Peak>And getting
accused of crimes we didn't

734
00:41:18.085 --> 00:41:23.205
commit. And professor Fritz's
concern about the feedback loop

735
00:41:23.205 --> 00:41:26.325
is that it affects everyone
differently.

736
00:41:26.980 --> 00:41:31.540
<v Megan Fritts>A lot of people
defend, letting students use AI

737
00:41:31.540 --> 00:41:35.940
in their work, by saying that
they see it as a tool for

738
00:41:35.940 --> 00:41:41.115
equity. Maybe for students who,
had a less privileged primary

739
00:41:41.115 --> 00:41:44.715
education or students for whom
English is a second language, I

740
00:41:44.715 --> 00:41:48.235
would contend the exact opposite
is true. That what this is doing

741
00:41:48.235 --> 00:41:51.755
is setting the stage for genuine
reading and writing skills

742
00:41:51.755 --> 00:41:56.090
becoming something that is
really only accessible to the

743
00:41:56.090 --> 00:42:00.650
elite class, those with a lot of
money and leisure time to

744
00:42:00.650 --> 00:42:03.610
cultivate them intentionally.
And so that's really something

745
00:42:03.610 --> 00:42:05.130
that concerns me quite a bit.

746
00:42:05.130 --> 00:42:06.890
<v Jason Oberholtzer>Okay. I'm
beginning to see why there's

747
00:42:06.890 --> 00:42:09.865
such difficulty forming
consensus around this. I mean,

748
00:42:09.865 --> 00:42:13.385
initially, I'll be honest, my
reaction was yeah. It's obvious

749
00:42:13.385 --> 00:42:15.305
that in a space where you're
supposed to be practicing

750
00:42:15.305 --> 00:42:20.185
thinking and metabolizing your
own thoughts that AI is just not

751
00:42:20.420 --> 00:42:23.540
helpful. It's not there for any
reason except for you to finish

752
00:42:23.540 --> 00:42:26.020
the paper, which is a
representation of the thoughts

753
00:42:26.020 --> 00:42:28.340
you were supposed to be having,
and it is like the wrong

754
00:42:28.340 --> 00:42:30.260
takeaway from what you're doing
in the classroom.

755
00:42:31.060 --> 00:42:34.515
But now that they're applying
these frameworks around

756
00:42:34.515 --> 00:42:38.435
accessibility, I can see it
becoming a little more

757
00:42:38.435 --> 00:42:41.155
complicated. I guess a lot of it
boils down to how much you think

758
00:42:41.155 --> 00:42:44.915
the role of the academy is to
prepare you for work,

759
00:42:45.390 --> 00:42:49.710
employment, and how much it is
to help you engage with how you

760
00:42:49.710 --> 00:42:55.790
think and learn. But I'm
beginning to see why the stakes

761
00:42:55.790 --> 00:42:58.670
are a little more complicated
than perhaps they feel at first

762
00:42:58.670 --> 00:42:58.910
blush.

763
00:42:59.405 --> 00:43:01.645
<v Tori Dominguez-Peak>Yeah. And
then when you think about

764
00:43:01.805 --> 00:43:05.565
professor Fritz's role in the
committees and how it's just

765
00:43:05.565 --> 00:43:10.045
really hard to come to agreement
or make any sort of policy, I

766
00:43:10.045 --> 00:43:12.790
think this is something that
they're gonna continue to

767
00:43:12.790 --> 00:43:14.630
wrestle with for a long time.

768
00:43:26.070 --> 00:43:30.085
<v Megan Fritts>As for myself, my
policy won't be changing. And

769
00:43:30.165 --> 00:43:33.205
that's, you know, that's about
all I can do about that.

770
00:43:44.290 --> 00:43:48.290
<v Jason Oberholtzer>Tori, thank
you as always for bringing in a

771
00:43:48.290 --> 00:43:50.610
really insightful piece here.

772
00:43:51.170 --> 00:43:52.850
<v Tori Dominguez-Peak>Yeah.
Thanks so much for letting me

773
00:43:52.850 --> 00:43:53.730
talk about it.

774
00:43:53.730 --> 00:43:54.210
<v Jason Oberholtzer>Yeah.

775
00:43:54.690 --> 00:43:56.850
<v Tori Dominguez-Peak>And I just
wanted to shout out thanks to

776
00:43:56.850 --> 00:44:00.025
professor Fritz and to doctor
Sosa Silva.

777
00:44:00.265 --> 00:44:02.025
<v Jason Oberholtzer>At least one
of whom is probably going to be

778
00:44:02.025 --> 00:44:05.545
getting me out of jail over the
next couple years. So pre thank

779
00:44:05.545 --> 00:44:06.505
you for that one.

780
00:44:06.665 --> 00:44:08.105
<v Tori Dominguez-Peak>Pre thank
you for that one.

781
00:44:09.465 --> 00:44:12.505
<v Jason Oberholtzer>Tori, where
can folks find you and all of

782
00:44:12.505 --> 00:44:15.760
the writing that definitively
comes from your own brain on the

783
00:44:15.760 --> 00:44:16.320
web?

784
00:44:16.400 --> 00:44:18.320
<v Tori Dominguez-Peak>You can
find me at tooridp98

785
00:44:18.320 --> 00:44:24.800
.bluesky.social. And you can
also find my podcast about video

786
00:44:24.800 --> 00:44:29.965
games that I make of my own
brain and play with my own brain

787
00:44:30.365 --> 00:44:34.365
at
press-startpod.bluesky.social.

788
00:45:15.020 --> 00:45:18.300
<v Jason Oberholtzer>Tuesday,
September 23, 10:30AM.

789
00:47:35.515 --> 00:47:37.435
<v Mike Rugnetta>That is the show
we have for you this week. We're

790
00:47:37.435 --> 00:47:40.460
gonna be back here in the main
feed on Wednesday, October 8. We

791
00:47:40.860 --> 00:47:45.980
are proud and thankful and
extremely lucky to have the

792
00:47:45.980 --> 00:47:50.620
member community that we do.
Without the support of our

793
00:47:50.620 --> 00:47:56.515
members, this show would not and
could not exist. So I just wanna

794
00:47:56.515 --> 00:47:58.275
say thank you.

795
00:47:58.915 --> 00:48:02.515
If you would like to become a
Never Post member and join this

796
00:48:02.515 --> 00:48:06.690
community for as little as $4 a
month, you can do that at

797
00:48:06.690 --> 00:48:11.890
neverpo.st. Where also if a
membership is a little too big

798
00:48:11.890 --> 00:48:15.970
of a commitment in these strange
and trying times, you can also

799
00:48:15.970 --> 00:48:20.475
tip us a one time any dollar
amount, and we promise that we

800
00:48:20.475 --> 00:48:26.155
will spend every last red cent
on arcade games, chewing gum, and

801
00:48:26.155 --> 00:48:39.730
baseball cards. Become a member
at neverpo.st. Never Post's

802
00:48:39.730 --> 00:48:41.890
producers are Audrey Evans,
Georgia Hampton, and the

803
00:48:41.890 --> 00:48:44.615
mysterious, doctor first name,
last name. Our senior producer

804
00:48:44.615 --> 00:48:45.575
is Hans Buetow.

805
00:48:45.575 --> 00:48:48.615
Our executive producer is Jason
Oberholtzer, and the show's host,

806
00:48:48.615 --> 00:48:55.495
that's me, is Mike Rugnetta. And
then this warning flashes on the

807
00:48:55.495 --> 00:48:59.510
light meter. Inside the house, a
pilot light is always burning in

808
00:48:59.510 --> 00:49:04.230
the oven's eyes. The low roof is
pulled down over the eyes like a

809
00:49:04.230 --> 00:49:07.350
hat. And underneath the
warnings, light motif networks

810
00:49:07.350 --> 00:49:12.625
of subterranean lines run like
the nervous system or bloodlines

811
00:49:12.865 --> 00:49:16.145
or fractures spreading from
tectonic lines of fault.

812
00:49:16.865 --> 00:49:20.065
In distant coasts, heavy and
light petroleum is piped across

813
00:49:20.065 --> 00:49:25.105
state lines and gas, electric,
oil, and water lines convey

814
00:49:25.105 --> 00:49:31.270
their vital humors to the house.
Excerpt of "Nervous Systems" by

815
00:49:31.270 --> 00:49:35.430
Greg Williamson. Never Post is a
production of Charts and Leisure

816
00:49:35.430 --> 00:49:37.990
and is distributed by
Radiotopia.