Episode 58

Understanding Digital Wellness: Part 2

This podcast episode delves into the pressing concerns regarding children's safety in the digital realm, particularly in relation to video games and social media platforms. We emphasize the alarming prevalence of online predators and the importance of vigilance when it comes to the games children engage with, specifically highlighting popular titles such as Roblox and Minecraft. Furthermore, we explore the implications of data permanence in the age of social media, where once-shared content may perpetually exist, potentially leading to unwanted repercussions. We advocate for a cautious approach to sharing images of children online, stressing the need to protect their identities. Ultimately, we reflect on the broader societal impacts of technology, urging listeners to prioritize holistic wellness and maintain a discerning attitude towards their digital interactions.

Takeaways:

  • Popular video games like Roblox and Minecraft pose significant risks to children, as they are frequently targeted by online predators.
  • It is essential to recognize that social media perpetuates a false sense of reality, leading to detrimental effects on mental health and self-esteem among users.
  • Parents should exercise caution regarding their children's online presence, opting not to share their children's images on social media to protect their identities.
  • Data generated on social media platforms is permanent and can be misused, highlighting the importance of safeguarding personal information on the internet.
  • The rise of artificial intelligence has profound implications for society, particularly in the realms of privacy, identity theft, and the creation of deepfakes.
  • Individuals must remain vigilant about the data they share online, as it can be exploited for various malicious purposes, including financial fraud and reputational damage.

Companies mentioned in this episode:

  • Roblox
  • MySpace
  • Minecraft
  • Instagram
  • Snapchat
  • TikTok
  • Hulu
  • Disney
  • 23andMe

Support Us

Savannah IG: Holyistic_Wife

Jeremiah IG: Holyistic_Husband

Boy: Atlas Haiku 🐾 Girl: Hollie Scypher: https://www.instagram.com/holyistic_pups/?hl=en

Facebook: https://www.facebook.com/people/Casting-Seeds-Podcast/61557099641711/?mibextid=ZbWKwL

Email: CastingSeeds.Podcast@gmail.com

Remember to subscribe, share, and leave a review if you find this episode valuable. Connect with us on social media to join the ongoing conversation.

Check out our Amazon Store

https://www.amazon.com/shop/holyistic_wife/list/3J3IEE9W1QB8T?ref_=cm_sw_r_apin_aipsfshop_aipsfholyistic_wife_9JJX9D2TN3ZVH0NDZ3FX&fbclid=PAZXh0bgNhZW0CMTEAAaayiYOzckqMItDHQpQpr19uGX3PpXgaRF9sy09UmxYjDYV87vrCK59VaSw_aem_fQittPh8jGgsYt5OeQs5IQ

Transcript

Speaker B:

You know, if your child is playing video games, I would look up on the Internet which video games are ranked highest for predators.

Speaker A:

Oh, Roblox, I know, is one of them.

Speaker B:

You know, Roblox is the number one video game for children in the world.

Speaker A:

Yeah.

Speaker B:

And it's on mobile and consoles.

Speaker A:

I don't even know what it is.

Speaker A:

I just know it's, like, one of them.

Speaker B:

It's like Minecraft.

Speaker A:

Well, that's also a bad one.

Speaker B:

Yeah, Minecraft.

Speaker B:

Minecraft has a lot of predators on it.

Speaker A:

I don't.

Speaker A:

I don't really know many games, so.

Speaker B:

So the online ones, I mean, it wasn't as bad when I was a kid, but it's gotten a lot worse.

Speaker B:

And so one thing about social media.

Speaker B:

Social media is constantly recorded.

Speaker B:

And a lot of people don't think about this, but your data is not being deleted.

Speaker B:

It is saved.

Speaker B:

Whatever you have said, whatever photo you have posted, it will never go away.

Speaker B:

It'll never go away.

Speaker A:

I remember my parents saying that to me when I, like, made a MySpace and I posted a photo where, you know, I was a teenager and it was more booby.

Speaker A:

And they're like, you have to take that down.

Speaker A:

That's never gonna.

Speaker A:

Like, you already had it up, but delete it right now.

Speaker A:

Like, that's never gonna go away.

Speaker A:

And that concept just didn't make sense to me as a teenager.

Speaker B:

I think this is going to be a good segue into data from this point.

Speaker B:

And this is really.

Speaker B:

It's really sad because you're posting things to share with your family and to be loving and kind, and you post a photo of your son or daughter and somebody steals it.

Speaker B:

Now it is considered child pornography.

Speaker A:

Yeah.

Speaker A:

So many people are doing that.

Speaker A:

Jeremiah and I actually already decided, and that's going to be a discussion we have to have with our families, that we do not want our child's face on social media.

Speaker B:

No.

Speaker A:

At all.

Speaker A:

If people take a photo and they want to post it, they have to put like a little heart emoji or something.

Speaker B:

Yeah.

Speaker A:

To protect our child's identity.

Speaker A:

Not because of, like, our show or that we're famous in any way, but our child should get to a point where they feel comfortable posting themselves on the Internet and sharing that data when they want to.

Speaker A:

But especially with AI and a bunch of other things like that.

Speaker B:

Yeah.

Speaker A:

There are scary things happening.

Speaker A:

And even before that, honestly, I've known people who have had private profiles whose children's identities were still taken, and people will pretend to have families with those photos.

Speaker A:

People will sell those images like sexually.

Speaker A:

Especially if like a child's in a bathing suit or has their shirt off or you know, whatever.

Speaker A:

It is scary out there.

Speaker B:

Yeah. And there's people jumping into people's DMs. That didn't just happen once. That is constant, never gonna go away.

Speaker B:

Instagram now has the ability to hide your conversations.

Speaker A:

Yep.

Speaker A:

Just like Snapchat.

Speaker B:

Platforms like Instagram and TikTok, they create unrealistic expectations for people also. And it starts to lead people down a path of low self-esteem, envy, and depression, and people start comparing their lives to highlights and reels and what people are posting.

Speaker B:

I mean it was bad enough when like the Kardashians and all those people started having their TV shows.

Speaker B:

It's even worse with this, because now it feels like it's more attainable, because now all I have to do is become social media famous. And yeah, it's horrible.

Speaker A:

Yeah.

Speaker B:

And so another thing is, it's a distraction. It's all designed to be a distraction. And we should turn those distractions into prayer and spiritual growth instead of having endless notifications and mindless scrolling stealing time away from our prayer and, like, our Bible time and being with God.

Speaker B:

We should really reflect and think about how much time we spend aimlessly and how much effort we put into the world.

Speaker B:

That is not really going to do anything for anybody.

Speaker A:

Yeah.

Speaker A:

If you're on there, don't be a time waster, you know? That's why when I'm on there, if I'm gonna scroll and I'm gonna see something bad pop up, I'm like, well, I'm gonna use my voice to speak out for Christ.

Speaker A:

And I know a lot of people, again they think that's silly, but it makes it where then I'm actually like, I share the gospel at least five times a day while I'm on social media.

Speaker B:

Well, it's not safe out in the world lately. Before a lot of things changed politically, it was not safe to go talk about God. There's a huge target on you depending on where you are.

Speaker B:

Yeah.

Speaker B:

And so I think it's one way to speak up and also share. And the thing is, you're not only talking to that one person, you're talking to everyone else that's in that comment section.

Speaker A:

Yeah.

Speaker A:

People can see, and even if someone doesn't comment or respond, they can still read it and have that planted in their heart, which is really good.

Speaker A:

I also want to ask for social media specifically, cuz I see it all the time, especially on Facebook.

Speaker A:

Like, older people do not understand that so many videos and images are created with AI.

Speaker B:

Oh yeah.

Speaker A:

So like AI and I know you talked about like information control and stuff.

Speaker A:

Like how does that work with like the algorithm and all those thingies?

Speaker B:

AI and information. Well, first I want to rewind just one step, because I wanted to talk about the dopamine addiction that social media has now created.

Speaker B:

Like video games.

Speaker B:

Video games created this ideal adventure for people who are scared to go and do things and be successful in their life. They're able to jump into a video game, develop an adventure, and find success in the video game. A quick fix.

Speaker A:

Yeah, quick success.

Speaker A:

That makes sense.

Speaker A:

And even, like, a like on a photo.

Speaker B:

Social media. Yeah, same exact thing. You get that comment, you get that like.

Speaker A:

That's why I turned off all the likes. Like, I don't see likes that other people get, and I don't let people see any of mine.

Speaker B:

I just, I want to put that in there.

Speaker B:

And then also most people that are on social media and watch like YouTube videos and are on video games, they have a really short attention span.

Speaker B:

If they lose attention, that's it.

Speaker B:

And that's another really big thing.

Speaker A:

If people listen to our podcast, they have good attention spans.

Speaker B:

Yeah.

Speaker B:

At least they're practicing.

Speaker A:

Good job, everybody.

Speaker A:

I'm so proud of you.

Speaker B:

If you've made it this far, congratulations.

Speaker B:

Okay, now let's jump back into AI.

Speaker A:

Okay.

Speaker B:

AI. It is not artificial intelligence, first off.

Speaker A:

Oh, okay.

Speaker A:

I'm so glad you're going.

Speaker B:

AI is not artificial intelligence.

Speaker B:

It is an algorithm.

Speaker B:

It is a bunch of. Let's just say you're going to take a math test, and on the computer your math teacher has put a bunch of math problems.

Speaker A:

Like possibilities.

Speaker B:

No, no.

Speaker B:

Just like 1 plus 1, 2 plus 2, 3 plus 3.

Speaker B:

Okay.

Speaker B:

Now, when you go in and you answer the questions, it is learning from you in the sense of, oh, this person knows the answer to this, or needs help with this. After this is done, it knows you want to talk about math. So when you ask about logarithms or you ask about geometry, it knows: now I'm pulling this database in, and now this database is going to answer your question.

Speaker A:

So what you're saying is that AI is. What does it actually stand for, then? Would it be algorithm-something?

Speaker B:

It is a library of algorithms that are developed to answer questions and respond.
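
A minimal Python sketch of the keyword-routed lookup described here; the topics, questions, and answers are invented purely for illustration:

```python
# Toy version of the "library of algorithms" idea: the program doesn't
# understand anything, it just routes a question to whichever canned
# database matches, the way the quiz example "learns" which topic you need.
databases = {
    "math": {"2 + 2": "4", "3 + 3": "6"},
    "geometry": {"triangle angles": "They sum to 180 degrees."},
}

def answer(question: str) -> str:
    q = question.lower()
    # Pick the first database whose topic appears in the question,
    # then look for a known sub-question inside it.
    for topic, facts in databases.items():
        if topic in q:
            for key, value in facts.items():
                if key in q:
                    return value
            return f"I have a {topic} database, but not that exact question."
    return "No matching database."

print(answer("math: what is 2 + 2?"))       # -> 4
print(answer("geometry: triangle angles"))  # -> They sum to 180 degrees.
```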

Speaker A:

So it's more of like a humongous database.

Speaker A:

And then if you are asking, like, I need help with poems or whatever, it then goes into that database, finds all the rhyming tools, and then does it.

Speaker A:

So it's like a fast computer.

Speaker B:

It's a fast computer with solutions, but.

Speaker A:

It can't technically make up.

Speaker A:

Oh, is this why when people were creating images and songs through AI, they technically still had the rights to it?

Speaker A:

Because AI technically can't create anything.

Speaker A:

I heard about someone suing an AI company for that.

Speaker B:

Two things.

Speaker B:

AI is pulling from the Internet, so it's artificially grabbing all this data off the Internet, all these different databases.

Speaker B:

Okay, let's talk about music.

Speaker A:

Including your data, then.

Speaker B:

Yeah.

Speaker B:

It's pulling all your music from every single artist and then it's taking what you told it to do, and it's going to add all of that data to this data.

Speaker B:

So it's creating a new song technically, but it's stealing it from everybody.

Speaker A:

Oh, my gosh.

Speaker A:

Yeah.

Speaker A:

Because then technically, like, T Swizzle could have a percentage of that song.

Speaker A:

Yeah.

Speaker A:

Or Bassnectar. I'm trying to think of super random ones. Chopin.

Speaker B:

Yeah, well, Chopin's dead.

Speaker A:

Well, I know, but you know, who owns his rights?

Speaker B:

Nobody.

Speaker A:

Okay.

Speaker B:

Anyway, after a certain amount of time, nobody owns the rights.

Speaker B:

That's why Disney. Well, I think after this year, Disney will no longer own the rights to Mickey Mouse.

Speaker A:

Actually, that already happened.

Speaker B:

Oh, it did?

Speaker B:

Oh, there you go.

Speaker A:

Interesting.

Speaker A:

Okay, well, yeah. And, like, there's so much deception, too, with AI.

Speaker A:

I've seen it with your family from Europe, the elderly side. Literally, like, there would be a pup, like a dog, where it's so obvious.

Speaker A:

Like, the puppy's feet are, like, super messed up.

Speaker A:

And it's like, so AI.

Speaker A:

And they're like, this is Chernobyl, babe.

Speaker B:

That's nuclear waste.

Speaker A:

No, but they're like.

Speaker A:

They're like, this is the cutest dog.

Speaker A:

Congratulations on your new puppy.

Speaker A:

And it's like, that is an AI generated image.

Speaker B:

Yeah.

Speaker A:

I mean, like, even the farmer in it, like, his nose is all messed up, you know what I mean?

Speaker A:

And some of them are getting so scarily accurate that they're not fully messed up. And with images, you have to, like, really have an eye for it.

Speaker A:

Like, how? If you're not a tech person, how do you even start? Do you just have to practice and look things up?

Speaker B:

You have to be observant. You have to know what AI struggles with creating.

Speaker A:

Like.

Speaker B:

And look for that. Hands, yes.

Speaker A:

Yeah.

Speaker A:

Spaghetti, for some reason. Crowds. Videos, they still struggle with.

Speaker B:

And so I think one of the hard things in the coming years is going to be telling the difference between reality and what's fake.

Speaker B:

For sure.

Speaker A:

Yeah.

Speaker B:

And I think one of the biggest issues with AI is a lot of people are saying AI is God now.

Speaker A:

Speaking of that, you want to hear a fun fact?

Speaker A:

There are quite a few Christians who are going on and asking AI if you were to pick one religion in the world, Christianity.

Speaker A:

Okay.

Speaker B:

I've asked too.

Speaker B:

You don't think I did?

Speaker A:

You're gonna let me finish?

Speaker B:

I did it too.

Speaker B:

Okay.

Speaker A:

Okay.

Speaker A:

But here's why.

Speaker A:

So they would ask AI specifically, if you could pick any religion in the world to actually be true, which one would it be and why?

Speaker A:

And you have to use a one word answer.

Speaker A:

They're like, keep it specific, one word answer.

Speaker A:

And every single time it will say Christianity.

Speaker A:

And when they ask why, like, give me a detailed answer now, like, as.

Speaker B:

To what historical facts.

Speaker A:

Yeah, they said it's the only one that is historically accurate, like, through and through, where it has the most people who witnessed, like, Jesus rising from the dead, from different accounts that were separated over thousands of years.

Speaker A:

The most prophecies ever fulfilled.

Speaker A:

Like, all the things that we say as Christians to show that scientifically the Bible is accurate. Just the Bible alone, let alone sources outside of the Bible that talk about Christ and talk about the Word. And it's so cool. It's crazy how humans, even humans who don't believe in it, have affected the Internet so much that the actual data shows that Christianity is legitimate.

Speaker B:

Yeah.

Speaker A:

That's the only cool thing that I could really say.

Speaker A:

Thank you, AI. Fake artificial intelligence.

Speaker B:

AI also agrees that the planet Earth and all of our universe cannot have come from nothing. It has a creator.

Speaker A:

Yeah.

Speaker A:

It has to be created.

Speaker A:

That's true.

Speaker A:

Yeah.

Speaker A:

That was one of the things I said too.

Speaker B:

Yeah.

Speaker A:

So that's a little fun fact about AI that I was excited to bring, but I guess Jeremiah literally beat me to the punch on that one.

Speaker B:

Yeah.

Speaker B:

As soon as I found out ChatGPT was a thing, I was just asking it all these questions and trying to figure it out.

Speaker A:

I can think of a few people who now I want them to look that up.

Speaker A:

I won't say if you're listening.

Speaker A:

You know who you are.

Speaker B:

And so a lot of these different technologies that have come into our life in the last 20, 30 years have now created this world that has a lot of isolation. There's just so many people who are disconnected relationally from other people, and they choose to live in a video game, or have a sim. A sim for a girlfriend.

Speaker A:

I don't even know what that means.

Speaker B:

Japan has a pandemic of men who prefer to date an artificial intelligence woman on their phone.

Speaker A:

So, data.

Speaker B:

Data. Yeah, they choose to date that versus dating a woman.

Speaker A:

A data woman. Sorry, they literally have a data woman.

Speaker B:

Yeah, they have a data woman instead of dating a woman.

Speaker A:

So funny, but also sad.

Speaker A:

Yeah, that's crazy.

Speaker B:

Yeah.

Speaker A:

But don't they want, like, the physical touch? Or do they have, like, those weird. I've heard that there are weird online, like, bodysuits and things.

Speaker A:

This may sound a little graphic.

Speaker A:

So if there are little ones around, maybe pause, you know, if you're playing this out loud.

Speaker A:

Um, and I only know about this, by the way, because I tried to do a little bit of research preparing for this podcast.

Speaker A:

And also for the pornography episode that we did, I read about this for the first time back then, over a year ago, I think, whenever we did that. There are suits, and they were originally created for couples who were, like, far away from each other, where, like, if you touch your thigh, I can feel it on my thigh.

Speaker A:

Type of a thing.

Speaker B:

Yeah.

Speaker A:

And then obviously same thing with, you know, sexual organs.

Speaker A:

But I've heard that they've used that, and then they've been able to connect that online to, like, a virtual boyfriend or girlfriend.

Speaker B:

Yeah.

Speaker A:

Not even a husband or wife.

Speaker A:

It literally has to be a boyfriend or girlfriend, which is stupid.

Speaker A:

And then they have types of intercourse with these games, or with these quote, unquote, AI data girlfriends.

Speaker A:

And it's mostly girls.

Speaker A:

They don't make them for women.

Speaker A:

Women don't really want these things.

Speaker A:

They want actual physical men.

Speaker B:

Interesting.

Speaker B:

I don't know.

Speaker A:

Yeah, that's.

Speaker A:

Well, the reason why I looked it up again is because I saw it in the porn episode, and then I was like, it must have advanced. Like, it's been over a year since we had that episode. And it has. They have crazy stuff now.

Speaker A:

And also they have those, like, chat rooms where people dress up.

Speaker A:

It's mostly furries, which is kind of weird, but they dress up as like, animals and stuff.

Speaker A:

And then they're in these chat rooms.

Speaker A:

And I actually watch a guy on Instagram.

Speaker A:

He's so funny because he goes in and he calls people out for, like, being predators on there or being inappropriate and like, having weird virtual sex.

Speaker A:

And they'll be like, hey, can you please not be close to me?

Speaker A:

I have a phantom suit on.

Speaker A:

I can feel you.

Speaker A:

Or when you walk through me, I can feel it in my chest.

Speaker B:

Interesting.

Speaker A:

I know.

Speaker B:

Yeah.

Speaker A:

So, yeah, loss of real connection.

Speaker B:

Yeah, there's a lot. There's a lot going on in the world.

Speaker B:

There's a documentary on Hulu right now, remember, about those guys that are, like, just antisocial and have this perspective on women, and they just don't like women because they all agree socially on the Internet that they don't like women.

Speaker A:

Yeah.

Speaker A:

They're now like a type of terrorist, right?

Speaker B:

Yeah, they're considered a terrorist organization now in certain countries.

Speaker B:

I don't remember what they're called.

Speaker A:

They're, like, incels or.

Speaker B:

Yeah, something.

Speaker A:

But it's weird. Yeah, they've been burned by women, or they've never been able to get a date or whatever. And so then they start to hate all women, and some of them literally will, like, kill them and stuff.

Speaker B:

Yeah, there's.

Speaker B:

There's been some stuff.

Speaker A:

There are women versions of it, too, that dislike men.

Speaker B:

Well, no, they dislike women because they think that women find other men way more attractive than them. So they'd rather be against women and for themselves.

Speaker A:

That's a different one.

Speaker A:

But yeah, there's ones that hate men too.

Speaker B:

Yeah.

Speaker B:

People are crazy without God.

Speaker A:

Yeah.

Speaker A:

Well.

Speaker A:

And so, all of that being said, all of this technically boils down to data on the Internet, right?

Speaker A:

And the way that it's used.

Speaker A:

Can you, like, explain what data actually is?

Speaker B:

Data is all personal information that. Well, I guess in reference to what we're talking about, data for individuals is our personal information on the Internet.

Speaker A:

So is data in general just all information on the Internet then?

Speaker B:

Yes.

Speaker A:

Okay.

Speaker B:

Yeah.

Speaker B:

Data could be transferred through the Internet, through a fax machine, through a phone call, text message.

Speaker A:

And it's all traceable?

Speaker B:

All traceable.

Speaker B:

It's all recorded.

Speaker A:

Crazy.

Speaker B:

I think the only thing that's not recorded would be transistor radios.

Speaker B:

Like, if you're on a walkie talkie, that's probably the only thing that's not being recorded.

Speaker A:

But radios still have their own frequency that can still be harmful to people to some degree, correct?

Speaker B:

I guess.

Speaker B:

Like at a very high or very low frequency.

Speaker B:

I'm sure there's something in there.

Speaker A:

Yeah.

Speaker B:

I haven't done the research for that.

Speaker A:

So here's my question for data in general.

Speaker A:

Because I've heard so many people say, especially even in the holistic world, they're like, there are so many physical, tangible things that could kill me. I literally don't care what's on the Internet.

Speaker A:

Why should someone care if their data is stolen or if someone else is using their data or if other people can read their data or explore their data?

Speaker A:

Like, why would someone care?

Speaker B:

So, some reasons would be identity theft, financial loss, privacy concerns, reputational damage, cybersecurity threats, legal and regulatory implications, and psychological impact.

Speaker A:

And so that can all happen from data.

Speaker A:

People stealing your data.

Speaker B:

Yeah.

Speaker B:

I mean, so, like, for the psycho. Psych.

Speaker A:

Psychological.

Speaker B:

Yes, thank you. For a psychological impact from that, there's a hack going on right now. There's a threat out there where a phone call calls you, records you saying hello, and then hangs up immediately. It gets your voice, and it's recording your voice every single time it calls you back.

Speaker B:

And it's going to impersonate you. It's going to call your mom, it's going to call your dad, pretend to be you, and it's going to say, send me money now, or, I'm in jail, can you come help me?

Speaker A:

Yeah.

Speaker A:

Or, I've been abducted. I've heard that one.

Speaker B:

Yeah.

Speaker B:

And it is an actual thing.

Speaker A:

Yeah.

Speaker B:

And it's a huge threat that causes huge distrust between you and your family members, and between you and ever wanting to answer a phone call again.

Speaker A:

Yeah.

Speaker A:

Creating code words and things like that would be really important.

Speaker B:

Yeah.

Speaker B:

For legal, say they do the exact same thing with your voice, and they call the FBI and say, I have a bomb, I want to do this.

Speaker A:

Oh, and represent you.

Speaker B:

Yeah.

Speaker B:

Or AI. They can take my photo, use AI, and make it look like I killed somebody.

Speaker A:

Like, transpose the image of your face onto a video.

Speaker B:

You could already do that with green screens.

Speaker A:

Wow.

Speaker B:

But now with AI, anybody could technically do it.

Speaker B:

As long as the technology keeps advancing with voice, it's gonna get worse.

Speaker A:

You know what I heard? A lot of people who are, like, into theft, who are stealing things, they're wearing a fake extra ring finger or, like, pinky or thumb on their hand, a fake one. And then when they steal things, they make that really evident while they're stealing. And they'll say, look, this is AI generated, it's not real, because my hand is messed up. I clearly only have five fingers on my left hand, and that shows six or seven.

Speaker B:

Yeah.

Speaker A:

Which is insane. So, like, people in real life are messing with it, which is crazy.

Speaker B:

Yeah.

Speaker B:

So then, like, cybersecurity threats. A cybersecurity attack is not normally targeted at one individual person. It is targeted at a single company.

Speaker B:

So let's say, like, you're using HBO.

Speaker B:

They hack HBO and they steal everybody's credit card off of whoever has a membership with them.

Speaker A:

Oh, yeah.

Speaker A:

People don't think about credit cards being a part of their data.

Speaker B:

Yeah.

Speaker A:

Or private health information.

Speaker B:

Private health information is a huge one.

Speaker B:

You don't think it's so relevant just because it's something that is, like, normally mailed to you, or you go pick up a hard copy and stuff like that. But the thing is, this data is saved on a server.

Speaker A:

Oh, yeah.

Speaker A:

People could steal your health insurance.

Speaker A:

They can.

Speaker A:

Wow.

Speaker A:

And I'm, like, realizing now, talking to you.

Speaker B:

Every one person's stolen medical information is worth $1,000 on the black market.

Speaker A:

How do you get onto this black market, this dark web? How does this.

Speaker B:

We won't talk about that.

Speaker B:

I don't want to get blocked.

Speaker A:

I just am saying, like, I don't understand, like, how these people have access to these things.

Speaker A:

That's so confusing to me.

Speaker B:

I mean, I'm glad I. You build them. It's all built. You cannot just buy any of this stuff anyway.

Speaker A:

I have no idea what that means.

Speaker B:

All right, so going on to the next one.

Speaker B:

Reputation.

Speaker B:

I mean, I think the phone call one shows reputation.

Speaker B:

Fake AI images of, like, somebody, I don't know, naked or posing, like, doing stuff on there.

Speaker B:

And then also, like, let's say, like, you're working for a company and you're in charge of, I don't know, let's say Disney imaging.

Speaker B:

Somebody hacks into your computer, steals all that information.

Speaker B:

That's going to affect your reputation.

Speaker B:

You might get fired for that.

Speaker A:

Whoa.

Speaker A:

Or if they plant things into your computer as well, and you don't even know it.

Speaker B:

I know people that I was working at Disney with. They checked their computers, they had bootleg DVDs on them, and they got fired and sued.

Speaker A:

Whoa.

Speaker B:

So your reputation's huge.

Speaker B:

And it's not just like, at home or with your family or friends.

Speaker B:

I mean, could you imagine with, like, a politician? You could fake so much stuff about politicians.

Speaker A:

I heard. I'm not going to say anything about the Clintons' laptop.

Speaker B:

So, obviously. Privacy concerns. Something we don't think about is that all our data is being used to market to you.

Speaker A:

Oh, yeah, it's being sold.

Speaker B:

So, like, you know when you're just talking about that one product, like, remember, we should really think about buying that? And then you jump on your phone and there's an ad for it.

Speaker A:

Yeah.

Speaker B:

They're listening. Your Amazon Alexa, whatever, is listening. Your cell phone's listening. And it's not just recording you, it's recording images also.

Speaker B:

You've seen that video.

Speaker B:

I showed you that video where it takes a photo of you every, like 10 seconds.

Speaker A:

Yeah.

Speaker A:

So it showed there were guys that were out who were, like, Navy SEALs, and they were wearing the green. You know, the night vision goggles.

Speaker A:

And they showed that, like, one of the most dangerous things for them to have on them is their phones. Because if you pull it out and it has face recognition, every 10 seconds it's scanning your face, and they can see that light scan through night vision goggles. Because you're getting that scan over and over again.

Speaker A:

Also, on that website, Tech Wellness or whatever, you can buy a little cover to put on the front of your phone camera that slides shut.

Speaker B:

Yeah.

Speaker A:

People have it on their laptops and stuff as well.

Speaker A:

Just because you can literally be recorded or watched. The government has permission to go through your phone or through your computer and watch you at any time using those devices.

Speaker A:

Bush wrote it into. What is that act? The Patriot Act.

Speaker B:

Yeah.

Speaker B:

After 9/11, he snuck it in there.

Speaker A:

Yeah.

Speaker A:

They can listen to you on your phone and they can also see you via video.

Speaker B:

Yeah.

Speaker A:

If necessary.

Speaker B:

And then also, I just want to say, like, yes, the government's watching, but whenever you download a new app, now that app is watching, is recording.

Speaker B:

One of the biggest mistakes you can make is linking a new account to your email or your Facebook.

Speaker B:

When you download an app and it's like, make a new account, do it the manual way. Type in your email, type in a password. Don't give them permission to look into your email, to look at your photos, to look at all this data, your social media. It is a huge security risk.
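
To make that concrete, here is a rough Python sketch of the consent URL behind a typical "Sign in with Google" button. The scope strings are real Google OAuth scopes; the client ID and redirect URI are placeholders invented for illustration.

```python
# Rough sketch of an OAuth consent URL. Every scope you approve here is
# data the linked app can pull from your account; "manual" signup with
# just an email and password never grants this kind of access.
from urllib.parse import urlencode

params = {
    "client_id": "EXAMPLE_ID.apps.googleusercontent.com",   # hypothetical app
    "redirect_uri": "https://example-app.test/callback",    # hypothetical
    "response_type": "code",
    "scope": " ".join([
        "openid",
        "email",                                           # your address
        "profile",                                         # name and photo
        "https://www.googleapis.com/auth/gmail.readonly",  # read all your mail
    ]),
}

print("https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params))
```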

Speaker A:

It's annoying to keep track of all those passwords, but it's definitely worth it.

Speaker B:

One thing I definitely recommend: look into second authorization, also known as two-factor authentication. Ways to protect your identity, protect your accounts, so that they're not all connected to each other with just one password.

Speaker A:

Is that what they Google? Second authorization?

Speaker B:

Yeah.

Speaker A:

Okay.

Speaker B:

There are apps that will do it for you, and there are websites that will do it for you.
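
For anyone curious how those authenticator apps work underneath, here is a minimal sketch of the TOTP scheme most of them use, written with the pyotp library (pip install pyotp); the secret below is a throwaway value generated just for the example.

```python
# Minimal time-based one-time password (TOTP) demo using pyotp.
# A real service generates one secret per account at enrollment and
# stores it; your authenticator app holds the same secret.
import pyotp

secret = pyotp.random_base32()   # shared once between service and your app
totp = pyotp.TOTP(secret)        # derives a fresh 6-digit code every 30 seconds

code = totp.now()
print("Current code:", code)

# The service checks the code against its copy of the secret; even if your
# password leaks, a login still needs this second, expiring factor.
print("Valid?", totp.verify(code))
```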

Speaker B:

And then for identity theft, I think everything we just talked about makes it very clear that your identity is very sensitive and able to be duplicated very easily.

Speaker B:

In today's day and age, it is no longer a time when you should be so loose with your information.

Speaker A:

Yeah.

Speaker A:

That's how two of my friends that passed away voted in the past two elections.

Speaker B:

Yeah.

Speaker A:

Can you guess for what side?

Speaker A:

So, yeah, you guys, this is also a very general episode. If there's something from this kind of tech episode that you want us to go more in depth on, we would love for you to just email us and let us know. We know this episode's a little bit longer.

Speaker A:

But that's because we wanted to cover a lot of different mediums and also try to dumb it down in the nicest way possible.

Speaker A:

Because I do not understand these things and I know if I don't understand, no one else is going to understand.

Speaker A:

And maybe some of this was already super easy for you and you guys knew a lot.

Speaker A:

But the reason why we saved the data information for last is because it really is connected to everything else that we spoke about first.

Speaker A:

So kind of tying it all together in that way.

Speaker A:

We just hope and pray that you guys are wise with the way that you spend your time on any type of device, and that the way you use your devices is wise in your home and in your lifestyle.

Speaker A:

And also making sure, specifically with tech, that it does not rule your life in any way, shape, or form, because only Christ should, first and foremost.

Speaker B:

Yeah, you said that well, love. That was really good.

Speaker A:

Thank you.

Speaker B:

Two other things I kind of want to add. One: GPS. Your location is constantly being tracked, and there are apps that are projecting your location.

Speaker B:

I believe it was Snapchat that had this huge lawsuit over locations, where people were being stalked.

Speaker A:

Yeah.

Speaker B:

If someone has access to your Google account, or you stay signed in to your Google account, your Gmail, on someone else's computer, they can track your location.

Speaker A:

Yeah.

Speaker A:

Predictive analytics.

Speaker B:

Right.

Speaker B:

It is all there.

Speaker B:

You can go and ask them to delete it.

Speaker B:

You have to petition for it.

Speaker B:

You have to go on their website and be like, delete this, please.

Speaker A:

Yeah.

Speaker A:

And the people who buy it are real. Realtors, real estate companies, law enforcement, and a bunch of advertisers purchase your data.

Speaker A:

So if you guys are anti those things, you better make sure your location is off in every way, shape or form.

Speaker B:

And then there's another thing for the cell phone and tracking location.

Speaker B:

If you have your Geico or your insurance company's app, or your cell phone connected to your car's app, they're tracking your speed. And if you're breaking the law, they know about it.

Speaker B:

So in the future, when, I don't know, ticketing becomes digital and they're like, we're gonna give you a ticket for not wearing your seatbelt.

Speaker B:

Your car knows you're not wearing your seatbelt.

Speaker A:

Yeah.

Speaker B:

And then the other thing is your DNA data. It's not talked about. It is a new thing.

Speaker A:

We had to deal with that recently.

Speaker B:

What is that company called?

Speaker B:

23andMe.

Speaker B:

They're under a huge lawsuit. A lot of the data, your DNA, has been stolen, and taken by the government. And who knows who else it's been sold to willingly.

Speaker A:

Your DNA is your genetic, biological makeup.

Speaker B:

I've heard a lot of different ideas of how they're going to use it.

Speaker B:

Can I say two?

Speaker A:

Okay, I'm scared.

Speaker B:

One is genetically modifying a virus that affects a certain genome in humans, so they could kill off a certain type of people.

Speaker B:

And then the other one is to use tracking systems to track different genetics around the world and see where they're located.

Speaker B:

So they can actually specify like who is who in the world, which is insane.

Speaker A:

So Jeremiah and I just dealt with this at our 12-week appointment, or no, 11-week, something random like that. It was before we hit our second trimester. Okay, 10 weeks, that's right. We were offered a test to figure out the sex of the baby.

Speaker A:

But we would then also have to sign over the DNA of my baby in my blood, and my own DNA, to legally be property of the state.

Speaker A:

And when she said that, I was like, what? I looked at the paperwork and I couldn't believe that I would actually have to sign it.

Speaker A:

We also just didn't want to know, like, if anything is wrong with the baby because we wouldn't be able to fix it anyway.

Speaker A:

We wouldn't want to terminate in any way, shape or form anyway.

Speaker A:

So we didn't want to have any dark cloud over our pregnancy.

Speaker A:

But on that, I was already kind of wavering, 50-50. And I was like, I don't know, it's not that big of a deal.

Speaker A:

I don't think we have anything wrong.

Speaker A:

But when I found out that the state would own my DNA? Absolutely not.

Speaker A:

By the way, this is also why hospitals and the state that you're in are able to sell, like, your placenta and the stem cells from it and use it for any type of research that they want. And your placenta is worth over $50,000, the faster they can harvest it. And you can't make money off of that, but the hospital and the state do.

Speaker B:

And then they charge you that like $20,000 medical bill minimum.

Speaker A:

Yeah.

Speaker A:

Depending on which state you're in.

Speaker B:

Crazy.

Speaker A:

So yeah.

Speaker A:

That's why, you guys, it really does all connect, physically and metaphysically, and we just want to encourage you to understand that this is equally as important as being chemical-free in your home or buying local or whatever it is that's really important to you.

Speaker A:

Your safety online, your children's safety, your loved ones' safety is so important.

Speaker A:

And if people poo-poo this, it's because they literally have not done their research.

Speaker A:

Yeah.

Speaker A:

And yeah, you know what?

Speaker A:

Listen to both sides.

Speaker A:

Do research on both sides.

Speaker A:

See what the side that's, like, anti all this has to say.

Speaker A:

Jeremiah and I did that.

Speaker A:

And it was just so clear, in our opinion, how biased it was. They just really want people to not care about the things that they post or say, because then it benefits them, and only them.

Speaker A:

It doesn't benefit you in any way, shape or form.

Speaker A:

So we love you guys.

Speaker A:

And babe, thank you so much for being my. Not my co-host today; we should have said that you were my interviewee.

Speaker B:

Co host is perfect.

Speaker B:

I'm permanent.

Speaker A:

It's true.

Speaker A:

He'll be with us again next week.

Speaker A:

But we love you guys, and we pray that through this you are able to make wise decisions holistically, and especially holyistically, in what that means for your online health.

Speaker B:

Yeah, be safe.

Speaker A:

Yeah, we love you guys.

Speaker A:

And no matter what you do, whether it's online or in person, please continue always to keep casting seeds.

Speaker A:

We hope you enjoyed learning how to cultivate God's creation from a biblical perspective.

Speaker A:

Holyistic health is to prioritize whole-person wellness through Christ. Like and comment on what topics we're casting, seeds or casting pearls.

Speaker B:

If you found the information provided useful, subscribe to our podcast for future updates.

Speaker B:

Leave a review to help us improve and share this episode.

Speaker B:

We would like to remind you before we leave that perfect health cannot be attained in this world.

Speaker A:

Only spiritual salvation through sanctification and repentance to God and turning away from sin will give you a perfect body in the kingdom come.

Speaker A:

Nourish yourself in the word, in prayer and in biblical fellowship daily.

Speaker B:

Thank you for joining us today, and a special thank you to our listeners for making this podcast possible.

Speaker A:

Always praying.

Speaker B:

Keep casting seeds.

About the Podcast

Casting Seeds
Biblical keys to Holistic living, in a fallen world

About your hosts

Savannah Scagliotti

▫️Host: Casting Seeds 🎙️
▫️Holistic Health Practitioner, Licensed and Certified Massage Therapist, Alignment Specialist & Western Herbalist
▫️Owner: Savannah Marie Massage
▫️Charter & Homeschool Educator

Jeremiah Scagliotti

▫️Co-Host Casting Seeds
▫️Producer
▫️Editor
▫️Engineer
▫️Christian, Husband, Business owner