Scott Wyden Kivowitz: [00:01:00] Oh man, I am so excited for this episode. This is a one hour and 50 minute conversation that I had with my friend Frederick Van Johnson from This Week in Photo. If you don't know TWIP, his podcast, his community, his website were actually acquired by Smugmug, who also owns Flickr, not long ago. And I sat down with Frederick to talk about the future of photography. This is part one of that episode. Because the conversation was so long, I split it into two parts, so they're about 45 minutes each. I hope you tune in.
Listen to the whole thing. See what Frederick and I had to say about what is coming with the future of photography. Thanks for listening. Let's get right into it. TWIP Workflows crossover. This is a unique thing that we're doing. It's very exciting, new to both of us, and I can't wait [00:02:00] to see where this conversation goes. Frederick, how's it going, man?
Frederick Van Johnson: It's going great. Yeah. I think, I think we're onto something here. You know, we're, we're crossing the flows, we're, we're experimenting with different prompt structures, right? That's, that's what we're doing with this show. And I think, you know, we're mixing it up a little bit, you know, so, you know, we both have a ton of ideas on the different topics that we're gonna cover in this episode.
And yeah, it makes sense. Like, you know, we've got, we've got your, your chocolate, my peanut butter. We'll put 'em together and we'll see what comes out at the end of the day. At the end of the day, the, the listeners will benefit because they get to hear a new voice on both of our platforms and, you know, and, and a mixture of ideas.
And just real quick before we, we kick it off, for the listening audience, just a, a heads up of what this experiment is about. And I know I probably covered [00:03:00] this, future me covered this in the intro for this episode. But I'm gonna rehash it just in case that guy dropped the ball. But Scott and I had this, Scott actually had the idea of doing this sort of collaborative podcast episode and I agreed.
I thought, yeah, what a great idea. How come I didn't think about that? And that's where we both get on, talk about the topics that we have outlined and then, you know, kind of syndicate or publish on both platforms. So I'm excited to be here.
Scott Wyden Kivowitz: Crossover. Yeah. It's like, it's like Chicago PD meeting Chicago Fire meeting Chicago Med, except it's only two of us.
Frederick Van Johnson: There you go. Something like that. I like chocolate and peanut butter. That's bad.
Scott Wyden Kivowitz: Yeah. Yeah, yeah, yeah. It all works. It all works. It all works.
Frederick Van Johnson: Yeah,
Scott Wyden Kivowitz: yeah, you, you and I have known each other for quite some time, and I'm glad, you know, I've been on your show, I think once or twice.
I can't
Frederick Van Johnson: so
Scott Wyden Kivowitz: remember if you were on the last podcast I hosted, but I'm glad that we, you know, we're doing this and [00:04:00] yes, for all the workflows, listeners, this is a fun thing. If it goes well, hopefully Frederick and I can do more of these. But this is, this is experiment number one and we'll see what happens.
So with that, let's, let's just, yeah, let's dive right in.
In-camera AI is already here
---
Scott Wyden Kivowitz: So this, this whole conversation that we're about to have is about a variety of different things with the future of photography. We're gonna touch on AI and a whole bunch of automations and things like that. The first thing I wanted to discuss was the fact that AI has actually been in our cameras for quite some time at this point. And, you know, I figured it'd be nice to just touch on the different aspects of where AI is in our cameras, whether people realize it or not.
Frederick Van Johnson: Yeah. Yeah. It's in, it's in several areas in there. The most obvious one is autofocus, and how autofocus is getting better and better is largely attributable to the artificial [00:05:00] intelligence or machine learning inside the cameras themselves. And what is that, right? So, yeah.
Great. My, my camera has AI in it, but does that, does that mean it's an android? No, it means the makers of the camera taught it to recognize certain things and understand those things and weight them differently in the frame and do things based on that weighting. I.e., a bird in the sky, maybe that's what they want.
Versus in the olden days, right, it was other ways that they detected what the, what the subject was, whether it be motion or contrast. Yeah, yeah. Those kind of things. Now it can still do those. I'm sure they're still using those, but as you know, they're using all of this stuff in concert together to get that amazing result, like on the Nikon Z9 and the Z8.
Now, I'm guessing, you know, the, the autofocus of those cameras is just, you know, kind of heralded as being borderline magic, right? That Arthur C. [00:06:00] Clarke style magic. And then cameras like Lumix on the Panasonic side, which, I'm an owner of both, in fact, I'm on the Lumix camera right now.
But back in the day, and Panasonic is, is aware of this, one of their Achilles heel weak points has been the focusing on micro four thirds cameras. So, right. So, add a little AI to the sauce and suddenly you get, you know, what photographers have been begging for, right? So,
Scott Wyden Kivowitz: Yeah. I mean, I feel like the AI for focusing started with faces, I think was the first. Recognizing overall faces, and it, it worked well. Cause as soon as you turn the head, you know, your focus would get a little confused if it was face recognition. Even that's gotten smarter, to still follow the face as the face turns.
Right. You're now following the, the, the head rather than the face. Then it moved into, to human eyes, it's moved into animal eyes, which is really interesting.
But then moving beyond faces became object [00:07:00] detection. So now as you said, like birds
Frederick Van Johnson: Mm-hmm.
Scott Wyden Kivowitz: full people, so you can detect full people in certain cameras.
And my favorite, even though I don't shoot this, but I know there's a lot of, at least there's, there's a lot of Imagen users that I know of, and I'm sure there's a lot of Smugmug users and Flickr users that, that are, are doing this. But like car racing, right? So NASCAR, for example, there's now cameras that can detect objects like cars
Frederick Van Johnson: Mm-hmm.
Scott Wyden Kivowitz: going by super fast. So, object detection, and then the, the absolute kicker is, what I think, I don't know if Sony's doing it, but I know Canon and Nikon are doing it now, where it's eyeball focus detection. So as you're looking through the EVF, it can detect where your eye is looking and focus on whatever.
So instead of you having to, you know, hit the autofocus button to focus the camera, it's just focusing on where your eye looks.
Frederick Van Johnson: You know, is that new, Scott? Is that new? Cuz I remember, I don't, I [00:08:00] think it was Canon, I wanna say Canon, way back in the day, developed something, and I don't know if it's still in the cameras, as I don't shoot Canon, I'm not sure what the, what the platform is like. But I remember there was this tech where it would look at your eye. There was a little tiny emitter in the viewfinder, in the, the eyepiece, that would essentially, I think it was smart and smartly executed, but very kind of, you know, ham-handed. Because instead of using, you know, the intelligence, the artificial intelligence, like, oh, I know this is an eye and there's a retina and it's looking over there, and oh, it's dilated or closed down, so it must be looking at this certain distance,
it's not doing all that stuff. That one was looking at the whites of the eye. So if it saw a lot of white in an eye, it knew the eye was probably looking in that direction, because the retina was, would have to be over [00:09:00] there. Therefore, it could use that information and put the focus area in that area of the screen to give it a kind of a head start on what you're looking at.
I'm not, I'm, I know I'm not doing the, the technology justice, but that's kind of, that's kind of what it did. Do you remember that? Do you remember
Scott Wyden Kivowitz: I do, I do. And I'm sure it was sort of built on that framework. Now it is what you just described, where it's, it's pinpointing everything, you know? So, like, the wedding industry now, it's sped up so many photographers' in-camera workflow, because now it's less, less half-pressing, or less AF-On button use, or, you know, back button focusing.
Frederick Van Johnson: Or just focus in general, right? Just focus, focus, focus. You know, I'm aging myself, but the first camera I owned was, well, I didn't even own it. It was an Air Force camera, and the first camera I was issued was a Nikon F3, manual, right? That thing was full-on manual, with a focus prism. Your older [00:10:00] listeners will understand the focus prism, where you had to get it together and then your image was in focus. So, you know, not to make this a focusing episode, but your skill as a photographer had to be such that you could internalize making sure the shot was sharp,
i.e., locked in. If it was, if it was moving, that was a whole nother bag of worms, cause you had to track it and, you know, keep the shutter button down. So it was that, along with all the other bits, whether it's composition, exposure, lighting, all that stuff. So all that stuff had to be in play in your head at the moment of capture.
And you fast forward to this glorious future we live in. And that part is lifted off of our shoulders. I don't, and I don't think. I don't think it makes, you know, people can argue this in the comments or whatever, but I don't think it makes you a better photographer or worse photographer now that you don't have to worry so much, if at all, about focusing.
I think it, it makes you, [00:11:00] if anything, it lifts the, the weight off of your brain of having to worry about, was that shot in focus? Have I thought about the composition? And I wanted that tree there and I wanted this, and then I had to interact with the subject and get him or her to smile and do all these things.
Now focusing is off the table, right?
Scott Wyden Kivowitz: Yeah, I think the word you just said, interacting with the client more, is like the key there. Because now, the less that you have to think about, or, like, you're thinking about it, but the less that some of your brain power has to be focused on
autofocus, for example, or manual focus, for example, or, or changing your shutter or changing your aperture, whatever it is, the less brain power you have to give to that means the more brain power you have for connecting and interacting with your, with your client.
So,
Frederick Van Johnson: absolutely. So it just gets us, it just makes us better. Yeah. And, and the other, I think just to, you know, put another nail in that coffin of that particular bullet
Scott Wyden Kivowitz: yeah. Yeah.
Frederick Van Johnson: is, you know, apple with their Apple Vision Pro headset that they just came out with. [00:12:00] So I'm, I've never touched that thing. I've probably seen as much as anybody else has seen about it, but it knows where you're looking as well, cuz it's got a series of cameras and sensors in there that are looking at your eyes so that, yeah.
So we can do all sorts of things like, what, what should, what are they looking at? What should be in focus, what should not be in focus? You know, how much processor speed should I apply to that part of the screen versus where they're not looking, all that stuff is happening. I gotta imagine those sorts of technologies, much like.
I don't know, like when, when smartphones launched, right? When, when, and started, started proliferating around the planet and became the norm. Other industries popped up because of smartphones and other technologies, like the miniaturization of compasses to go in the devices, the accelerometer, the, you know, altitude sensor.
All those things went in miniaturized on a chip into this device to make that device possible. This new device, the [00:13:00] Apple Vision Pro has other technologies in there that I gotta imagine are gonna find their way into other industries. Some existing, some that we don't even know about yet are gonna show up because of what Apple and their trillions of dollars are developing into this, this kind of new platform.
Scott Wyden Kivowitz: yeah.
Well, I mean, I, not to get too sidetracked on, like, outside of photography, but imagine that technology that's in these, you know, VR goggles, cause that's what I'm gonna call 'em. You know. Yeah. Imagine that they're now in your car and you look off the road and now your car's yelling at you for looking off the road.
Frederick Van Johnson: Yeah. Well, cars do that now. They do that. Now they have that built into the Teslas and, and many of the electrics. So when there's self-driving in there. Right.
Scott Wyden Kivowitz: Yeah. Yeah, true, true. Yeah. So
Frederick Van Johnson: you're, you're paying attention to the road, which I don't like. I don't, it is one thing to have that in my camera and to assist me with making a better photo.
Scott Wyden Kivowitz: your life's on the line.
Frederick Van Johnson: Yeah. To have a, well, yeah. I mean, I know it's for safety and everything, but I [00:14:00] just, I feel weird when I feel like, okay, someone's watching me, you know, and I'm driving. Even if it's an, it's an, an entity, an AI or my car or whatever for my own good, it is like, it feels a little invasive.
I don't know.
Scott Wyden Kivowitz: yeah. Yeah. I don't mind the car looking at the road for me, but I don't wanna
Frederick Van Johnson: Yeah. Don't look at me. Don't look at Yeah.
Scott Wyden Kivowitz: Yeah,
But so moving onto the
Frederick Van Johnson: doze off and hit somebody, then you're like, okay, well, why didn't you tell me?
Scott Wyden Kivowitz: Yeah, exactly, exactly. Damn new car.
AI for keywording your photos
---
Scott Wyden Kivowitz: So, so there's a bunch of other, newer AI tech, technology, that's making its way into apps. Some of it, again, has existed for some time. One example, I think, is something very beneficial to stock photographers, commercial photographers, landscape photographers, and in some cases potentially wedding photographers could utilize this.
But there's Excire Search and Excire Photo. The company who makes them, it scans your, your, your photos. The [00:15:00] Excire Search is a Lightroom plugin. Excire Photo is a standalone app, and it basically looks at your photos, analyzes what the photo is, and creates keywords for you based on a variety of factors.
And ON1 just came out with theirs. Well, we're recording this on June 21st, it's a Wednesday. On Monday the 19th, I guess, ON1 came out with their own version of that, which is a standalone app. There's no Lightroom plugin, at least right now. And it also does the same thing.
You know, it just scans your photos for keywords and adds the keywords to the metadata.
You can then load it up in Lightroom and now you've got all those keywords in Lightroom or in Photoshop or wherever. And I feel like, I've used, I've used both at this point. I've only used ON1's for a couple days at this point, but it's, it's amazing. As, as somebody who loves landscape photography, I find it amazing how accurate and interesting the keywords it comes up [00:16:00] with are.
So what are your thoughts on, on those?
Frederick Van Johnson: I, I agree. Yeah. I'm very, I, I know the folks over at Excire, you know, from the top on down. The great folks that are, that are, you know, genuine folks that are working on genuine software that solves a genuine, I don't know if you wanna call it problem, but an annoyance to photographers.
Scott Wyden Kivowitz: An annoyance for sure.
Frederick Van Johnson: Yeah. And it's my annoyance with Keywording comes in when I see friends of mine, like, I don't know, Photo Joseph or somebody like that, who's all in and meticulous on their keywording and the hierarchies and all that stuff, and everything's perfect. He could find, he could find the hair on a dog from 19, you know, whatever, you know.
And but, yeah, so in steps a company like Excire, and now ON1, with these, you know, a way to use artificial intelligence. In this case, you know, AI is, AI is a weird term, right? Because it's, it's thrown around a lot. It [00:17:00] covers a whole bunch of things, but everyone thinks it's only one thing. And in the case of this, what they're doing with the keywording is, again, it's back down to machine learning, i.e., brute force.
Feed the machine a bunch of photos so that it knows what's what. And then when it sees something that looks like that, apply that tag to it. That's all it's doing. So if it sees a red dog or a red cat or something, it's gonna go, oh, that is a cat. Okay, and here's the tree for cat: feline, claws, blah.
You know, so it's gonna put all that stuff in there for you automatically, well beyond anything that you or I would ever feel like or want to do as a human. We can throw this kind of menial grunt work over to the AI and have it do it more efficiently than we could ever conceive of doing. And that, that's, that's what it's for.
I mean, it's brilliant. I did a whole tutorial for the Excire folks, I think they may still have it on their site, [00:18:00] where I kind of walked through the software and demoed all the different pieces of how, how it works and how it can fit in. And it's magical, man. It, it is. It is really magical just to be able to
see, and you know, if you've been shooting for as long as I have, you probably have a library full of images, whether they're personal or, you know, kids and/or work or whatever. But to be able to, with the power of a piece of software, after it indexes everything, to go in and say, yeah, show me every photo from when I went to Las Vegas.
Boom. Okay. Show me every photo for when I went to the MGM Grand, or every photo that has, you know, it is, yeah, it's,
Scott Wyden Kivowitz: Yeah. A real world, a real world example of how I used Excire while working at Imagen, actually. So I've been at, Imagen now for, we're approaching two years already. And this was almost a year ago, in a couple weeks. So we were coming up to July 4th, to Independence Day here in the US, and our social media manager was like, Hey, [00:19:00] does anybody have fireworks photos?
And I was like, I do. So I loaded up Lightroom and, you know, I was using Excire Search, and I just literally just searched for the word fireworks, and I exported every photo of fireworks that I have ever shot into, you know, a bunch of JPEGs for him to choose from. It took me two seconds, and it was, you know, only due to having all my key, my keywords in place thanks to Excire Search.
So real world, like I had a need and it solved it. So,
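For readers who want to picture what this kind of keywording automation looks like under the hood, here is a minimal sketch, not Excire's or ON1's actual implementation. It assumes a placeholder classify_image() function standing in for the vision model, and uses the exiftool command-line tool to write the resulting keywords into each photo's metadata, where a catalog app like Lightroom can search them.

```python
# Hypothetical keywording pass: tag every JPEG in a folder so a later
# catalog search (e.g. "fireworks") finds it instantly.
# Assumes exiftool is installed; classify_image() is a stand-in for the ML model.
import subprocess
from pathlib import Path

def classify_image(photo: Path) -> list[str]:
    """Placeholder for the machine-learning step: return descriptive keywords."""
    # A real tool runs a trained image classifier here; we hard-code an example.
    return ["fireworks", "night", "long exposure"]

def write_keywords(photo: Path, keywords: list[str]) -> None:
    """Write keywords into XMP and IPTC metadata so catalog apps can search them."""
    args = ["exiftool", "-overwrite_original"]
    for kw in keywords:
        args += [f"-XMP-dc:Subject+={kw}", f"-IPTC:Keywords+={kw}"]
    subprocess.run(args + [str(photo)], check=True)

def keyword_library(folder: str) -> None:
    """Walk a photo library and keyword every JPEG it contains."""
    for photo in Path(folder).expanduser().rglob("*.jpg"):
        write_keywords(photo, classify_image(photo))

if __name__ == "__main__":
    keyword_library("~/Pictures/library")  # hypothetical library path
```

The point is simply that the tags end up in the file's standard metadata, so any application that reads it can search them later, which is what makes a two-second "fireworks" search possible.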
Frederick Van Johnson: That's why, you know, when people complain about, oh, AI's gonna take our jobs and blah, blah, blah, blah. Yeah, oh yeah, some jobs, absolutely. It's gonna, it's gonna eat 'em and spit 'em out, you know. We can see it already. You know, I'd be, I'd be terrified if I were in some industries that are gonna be affected by this, you know, whether it be people that write stuff or, you know, even, you know, as we see things sort of unfold. Even, you know, in the future, I, I predict we're gonna see certain industries, like even maybe headshot [00:20:00] photography, taking a hit. Because at a certain point you're gonna
be able to get a scan of yourself, a highly accurate scan of yourself, that then you'll be able to just say, Hey, put me in a Corvette flying down a desert highway, you know, wearing a red scarf flowing in the wind, and there you are, photorealistic you. So there's these things. I, I feel like those kind of jobs or those kind of tasks are in the crosshairs of this AI stuff.
But then there's the other things also that get painted with the same AI brush, like keywording and these more, more mundane things that we are like, oh, thank you AI for doing that. Or, I'm a, I'm a Grammarly holic, you know, the app Grammarly.
Scott Wyden Kivowitz: Same.
Frederick Van Johnson: I use it to appear more intelligent than I am every day. And I'm not embarrassed to say that, every day I use it. It's
Scott Wyden Kivowitz: Right. Right now we have Grammarly [00:21:00] is, is attached to Frederick's brain. And right now we're hearing the intelligent Frederick, but once we remove that chip, he goes back to the
Frederick Van Johnson: You know, it's low key. It is low key. Mrs. Mitchell, Mrs. Mitchell was, was my English, one of my English teachers in high school. And I remember the, yeah, I, I, she would correct me on certain grammar mistakes all the time. Like, oh, when you do a quotation mark, the, and you're quoting at the end of a sentence, the period needs to be inside the quotation mark, not outside, even though it looks like it should, you know, things like that.
Or this should be hyphenated, or that should be, or even the dreaded, where does the apostrophe go?
Scott Wyden Kivowitz: Mm-hmm.
Frederick Van Johnson: You know, especially when a name ends in an S and you're talking about something that's, you know. So those kind of things that trip up people, Grammarly catches those for me. And over time, if it catches them enough, you know, I'm like, oh, I, I don't make the mistake anymore.
So I don't make the mistake of putting the period outside of the quotation mark anymore [00:22:00] cuz Grammarly has hit me over the head so many times. It's now, you know, I do it right. So I, I would argue that I'm smarter. Or at least better grammatically than I was a year ago because of using this software and or using these tools like, like Grammarly.
It's crazy.
Scott Wyden Kivowitz: yeah, yeah.
Generative AI images
---
Scott Wyden Kivowitz: Totally. Totally. You brought up the headshot thing, we're gonna get back, we're gonna get to generative images. Do you wanna talk about that now and go back to the other, to the other two topics? Or do you wanna
wait and come back to it? All right, so, so, so generative images, right? Generative AI images, as officially termed or whatever. I dunno. I am, I have a love hate relationship with these. I can find them to be useful. The whole headshot thing, like, there's a website that exists. I tried it. You pay 20 bucks or whatever, you upload 20 or however many, 20 to 50-something photos of yourself, and then it literally creates headshots for you.
Frederick Van Johnson: Yeah. Lensa, Lensa AI, is the, I think that's one of the tools that does that. Yeah.
Scott Wyden Kivowitz: that's one of the tools. There's actually one [00:23:00] dedicated to just headshots
Frederick Van Johnson: Oh, is it? Oh, oh, I need to know the name of that. I'm gonna try it.
Scott Wyden Kivowitz: Yeah. I, I'll look it up when we're, when we're done, but it, it does a, a decent job. I mean, it still doesn't look real. Some of them don't even look like you, but it does a decent job. In fact, at ShutterFest, you know, like, you go to a trade show, you go to WPPI, you go to Imaging USA, whatever, there's always banners of the speakers and of the, of the sponsors and stuff like that.
At ShutterFest,
on the trade show floor, they had banners, and this year all the banners were of the speakers as AI. And it was the talk of the show because
Frederick Van Johnson: Yeah,
Scott Wyden Kivowitz: nine times outta 10 it didn't look like the speaker at all. So, so it's interesting.
Frederick Van Johnson: like what the speaker wished they looked like, right.
Scott Wyden Kivowitz: Yeah, yeah, yeah, yeah. You know, I've, I've played around with a lot of the generative AI image tools, and I've seen, [00:24:00] I've seen workflows. Like Sam Hurd, for example, just put out a YouTube video showing how he's automating going from his Canon camera to Dropbox or to a, to his FTP server.
And then Midjourney is taking it, taking that and creating four potential variations on the photo and putting them in a Dropbox folder within, like, a minute, all in real time. And so it lets him, like, shoot with a client and then see a variation, and maybe you can try something else, you know?
And so there's, there's potential really cool uses for it. I'm not a fan of just the create-this-image-and-it-outputs-it thing, right? I, I, I do like the whole Photoshop thing where it's like, expand the scene, let it take your photo and some AI and, and fill in. But
Frederick Van Johnson: yeah,
Scott Wyden Kivowitz: yeah,
Frederick Van Johnson: there's so much, there's so much to unpack there, and
Scott Wyden Kivowitz: I know.
Frederick Van Johnson: that. Thank you for bringing that up. Like I said, like I told you before, I just got back from InfoComm in Orlando. I did a [00:25:00] couple of, I sat on a panel that was talking about the future of streaming and how AI might, might affect that.
And then I did a talk on how you, you know, basically it was my workflow and, and different ways that I'm using AI to enhance my workflow that I wasn't doing a year or so ago. But, you know, it's the, on the generative side of things, it is, it, it, like I said in the talk, it's, it's, I'm cautiously optimistic, but I, I lean more towards optimistic on where this stuff is taking us.
I am heavily more towards optimistic but with a healthy dose of pessimism and fear. Right. To keep us, keep us alive. Right.
Scott Wyden Kivowitz: Fear. Fear.
Frederick Van Johnson: Yeah, fear. Yeah. But you know, when I, when I look at this stuff, especially the generative stuff, I can't help but be excited. To go back a little bit to your, to your, your thoughts about just sort of headshot photography not being there yet, where, you know, people upload a series of images and it does the thing: [00:26:00] we're in pre-alpha, I think, for this stuff, right? We're at the point now, like right now, you'd be hard pressed to find a person on the planet that doesn't know what an iPhone is.
Right? It wasn't always that way, right? At a certain point there was just a certain group of people, insiders, you know, that knew about it or knew it was coming or whatever. Even after all of Apple's launch plans or whatever, a certain contingent of the population on the planet was like, what? What's that? What's the iPhone? Whatever.
And now, of course, everybody knows. We've been marinating in it for years now. So everyone knows what an iPhone is, and it has advanced to the point where these things, our phones, are doing things we never conceived that they would do, right? They've birthed billion dollar industries that we had no idea we needed 10, 15 years ago, like Uber and Lyft and those kinds of things.
So I look at AI, especially on the generative side of things, as being kind of, we're at the, we're at launch, [00:27:00] at the beginning of all this. So you extrapolate out 10, 15 years or so. That's where photographers, I think, need to be looking, kind of skating to where the puck is going to be. So if you look at where this stuff is pointing to right now, with what Meta is trying to do, what Apple is trying to do, you kind of zoom out a little bit and look at what's happening with crypto and NFTs and how that's gonna work into this, and Midjourney and Photoshop with Firefly and how that's gonna work into these new, kind of, these mini worlds that are, are being built.
Look at what NVIDIA is doing with their hardware. You know, it's hard to think of a, of a scenario where super realistic, Unreal Engine quality avatars of our, our persons are not gonna be one of the next kind of Uber or holy grail companies. [00:28:00] Right now, it's like, okay, upload a bunch of photos, 2D, crappy photos of yourself, and the AI will do some interesting tricks with it, right?
That's kind of where we are now. But imagine the world where you can go to a facility somewhere, I don't know, maybe it's a photo studio of the future and get a full on perfect scan of your entire body as it stands right now. You know, of course you'll be able to make modifications to it, you know, if you don't like the way you look or whatever, but you'll have that data at a super high resolution.
You know, on a super high resolution scan that then you can do other things with, like I said, you know, put yourself in different situations, show up in Zoom calls render a headshot for whatever. Will headshots even be necessary if somebody can just call up your, your virtual profile on LinkedIn of the future?
So, you know, it's, it's those things. When I look at these technologies, Adobe and, and their, [00:29:00] their forward thinking to put generative AI as a tool within Photoshop, to give us a taste of what could happen? Brilliant. Yeah. Brilliant. And it's scaring, it's scaring the crap out of a ton of people, I think, and rightly so.
Headshot photographers, commercial photographers, like in that, that same ilk, right, Scott? So the same ilk of, okay, I can go to a facility and get myself scanned in. Now I can put myself in whatever situation. I don't have to worry about going to get a photo shoot. All my family's been scanned too, so I could do a Christmas photo.
You know, that's, that's appropriate for the time. And we're all in Christmas suits. I could generate that if I want to now. Of course. Yeah. You probably wanna just shoot that because it's personal. But you could generate that. I think about it from the impact on commercial photography though. So what happens when company A decides, Hey, we have this widget that we need a bunch of photos for, for the website.
Right. And they just get that widget scanned in, [00:30:00] much like car companies are doing today. They have full 3D representations of their cars; they can do whatever the heck they wanna do with them. Imagine when the whole world can do that with anything. Right. So yeah, it changes. It changes a lot.
Scott Wyden Kivowitz: Yeah, it's, it's, it is definitely a little bit of caution, a little, a little bit of fear, but a whole lot of optimism. It's definitely mixed emotions when it comes to generative, generative AI, but I, I am a hundred percent in agreement with you. I think what, what Adobe has done so far, it's the natural progression of: here's what people are doing to manipulate photos already.
Let's give them this AI to just take it to the next level and make it easy. Instead of having to go to Midjourney and do this, to then go back to here, then go back to here, just have it in one place and have it easy throughout the, throughout the software. I think there's still, there's definitely a way to go.
I [00:31:00] still think, like, yes, it's beta, but it's definitely still alpha. There's a way to go. I, I put out a video a few weeks back. It's the, the good, the bad, the ugly with, with Photoshop's generative AI that's built in. And I showed, I showed what it did great and what it didn't do so great.
Like when I asked it to replace a sock with a bird and it made a sock bird puppet, and I was like, that's not really what I asked it to do. It's funny, but
Frederick Van Johnson: It. Maybe it has a sense of humor, I dunno.
Scott Wyden Kivowitz: Yeah, maybe it does. But so let's, let's go to something a little less scary, in, in some cases very, very productive, very workflow oriented, and actually happening now, which is cool.
Camera to cloud automation and AI
---
Scott Wyden Kivowitz: Camera to cloud. So this is something that is being pioneered and spearheaded by Fuji, Fujifilm,
Frederick Van Johnson: Mm-hmm.[00:32:00]
Scott Wyden Kivowitz: Imagen, Smugmug. Who else is in on this? Skylum is in on this. And of course Frame.io, which is Adobe, they're in on this. So this is something that was demoed lightly at Imaging USA, and by lightly, I mean informally, but they still put on a presence, and then more heavily demoed at WPPI, and it was so cool.
So the concept is, and even though it's a proof of concept, it's, it's happening, like, this is happening. And again, it started with Fujifilm. There's a good chance, a very good chance, that a lot of cameras, like Panasonic and Nikon and Canon, will be doing stuff like this. But you're shooting in camera with, right now, the Fujifilm cameras, two of them that do this, and the photos go directly to Frame.io as well as the memory card.
So you got it in both places. You got your RAW file [00:33:00] on a memory card, and you got a RAW file that went to Frame.io directly. No phone needed in between. It is then culled and edited by Imagen's AI, right? So right now it's being edited by Imagen; culling can happen in the future just by flipping a switch, right?
But the editing, it's being edited by Imagen, again, through the cloud, and then it's being sent to a Smugmug gallery.
Frederick Van Johnson: Yeah.
Scott Wyden Kivowitz: Then my favorite part: the RAW file sent to Smugmug, with the XMP of the edits sent to Smugmug, and the outputted JPEG sent to Smugmug, in a Smugmug gallery, ready for your client to see and order prints from.
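To make the shape of that pipeline concrete, here is a minimal sketch of the hand-off Scott describes: RAW in, AI-generated XMP and JPEG out, all three files landing in a client gallery. The helper functions (fetch_new_raws, apply_profile_edit, publish_to_gallery) are hypothetical stand-ins, not the real Frame.io, Imagen, or Smugmug APIs; only the flow itself comes from the conversation.

```python
# Sketch of the camera-to-cloud flow described above, with hypothetical stubs
# standing in for the real cloud services. Only the pipeline shape is the point.
from dataclasses import dataclass
from pathlib import Path

@dataclass
class EditedPhoto:
    raw: Path    # the RAW file the camera pushed to the cloud
    xmp: Path    # sidecar describing the AI edits
    jpeg: Path   # rendered output, ready for the client to view and order prints

def fetch_new_raws(project_id: str) -> list[Path]:
    """Hypothetical: list RAW files the camera has uploaded to a cloud project."""
    return []  # a real implementation would call the ingest service's API

def apply_profile_edit(raw: Path, profile: str) -> EditedPhoto:
    """Hypothetical: run the photographer's learned editing profile on one RAW."""
    # A real service would return an XMP of edit settings plus a rendered JPEG.
    return EditedPhoto(raw=raw, xmp=raw.with_suffix(".xmp"), jpeg=raw.with_suffix(".jpg"))

def publish_to_gallery(photo: EditedPhoto, gallery: str) -> None:
    """Hypothetical: upload RAW, XMP, and JPEG to the client-facing gallery."""
    print(f"uploading {photo.raw.name}, {photo.xmp.name}, {photo.jpeg.name} -> {gallery}")

def run_pipeline(project_id: str, profile: str, gallery: str) -> None:
    # Each frame flows straight through: no phone, no desktop import step.
    for raw in fetch_new_raws(project_id):
        publish_to_gallery(apply_profile_edit(raw, profile), gallery)

if __name__ == "__main__":
    run_pipeline("sample-shoot", "my-editing-profile", "client-proof-gallery")
```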
Frederick Van Johnson: Yeah. Yeah. It is, it is. It feels like that's the holy grail, right? For me, it feels like that is the holy grail that, that a lot of photographers in a lot of different industries have been chasing for a while. Photojournalists, for example, and sports photographers. Like, you're, you're at the [00:34:00] game, you know. In the olden days, well, the olden olden days, you were shooting film, so there was that gap, right?
And then when it went digital, it was the gap of the time that it takes to get over to someplace to go through your images and upload them to AP or wherever, you know. And now with this camera to cloud piece, we're kind of removing that last bit out of there as well, which I think, which, which is a, a couple of topics to touch on there.
And the first one is, yeah, the convenience and the, the, the way that I feel like they're designing the technology to work. And yeah, you know, for folks that don't know, just, just to set the stage here: my company, This Week in Photo, the podcast, the educational resource, et cetera, community, was acquired last year, in 2022, by Smugmug, to go along with Flickr and Smugmug as one of the three brands over there.
So, I had a heads up that this was coming, right, the whole Fuji and camera to [00:35:00] cloud partnership between those, you know, between Smugmug and Fuji. And you know, I look at this and I think, okay. At its core, when I first saw it, I was like, okay, at its core we're, we're replacing that long Tether Tools cable that I have.
You know, when I'm shooting tethered and I got somebody in the studio and I shoot it and they can see it on the screen and, oh yeah, that's good, yeah, but maybe try not to smile, whatever. So that, that's the flow I was thinking. So that bandwidth of squeezing that JPEG, that XMP sidecar file, that RAW file, all the data is, in the case of Tether Tools, going through that cable, that orange cable, to your computer.
In the case of this, it's going to the cloud. I love my orange cables. Yeah. I had to throw one away yesterday. Oh cuz it was old. It was an old connector. But the, yeah, but instead it's going to the cloud and then coming down to wherever based on rules or where you want it to go. Like it could go to a Smugmug gallery that has certain permissions [00:36:00] and access privileges set on it already.
You know, for me personally, that's scary cuz I don't. For, yeah, I want it to go up there for me, but very rarely are my shots ready for human consumption, you know, right outta camera. I need to do a little, I put some Frederick seasoning on them before other humans see them. So there's that. But if you're doing photojournalism and you're out the field somewhere and the news needs to happen right now, boom.
And it's on the editor's desk and they can make the decision on what, what goes where. I think it's brilliant. But, you know, this isn't this, unless I'm missing something. This isn't necessarily brand new technology. The, I think the tech is new, what they're building. I have no idea what magic they put in the software and hardware to make this work.
But you remember back in the day, there was a company called Eye-Fi. You remember Eye-Fi?
Scott Wyden Kivowitz: Yep. The memory card. Did the, did the
Frederick Van Johnson: memory card. They had wifi, they had squeezed a wifi transmitter onto a little SD card that you'd stick in your camera, and it would [00:37:00] do something similar. You'd shoot, it would record to the chip, to the memory of the card, but then also go to Eye-Fi servers, which would then, based on your rules, syndicate that image out to different areas.
The JPEG, of course. Right. But you could have it go to Flickr, you could have it go to Smugmug, you could have it go to wherever, you know, all these different places at the point of capture. But it was buggy and it was slow, and it was JPEG only, and it was slow. And it was slow. So it didn't work. It didn't work.
Scott Wyden Kivowitz: I think the, yeah, I think the magic of what Fujifilm has done is they built it with, I believe it's Wi-Fi 6E or whatever the latest wifi is. So it's like the fastest possible wifi right now. And because it's built into the camera itself, not a memory card, it doesn't need that, that addition, and it doesn't lose any of that transmit speed the way a memory card solution would.
So you're getting the, the, the absolute fastest it can possibly process through the [00:38:00] wifi. So if you're on a Wi-Fi 6 network, you could potentially get almost a gigabit of transfer speed through wifi. Right. So yeah, that's, that is, that is a beautiful thing. And just to touch on the whole, like, you know, wanting to add the, the Frederick special sauce after you shoot the photos, that's where Imagen comes in, right?
Like, Imagen learns how you edit. So, like, you're getting it to your, to your Smugmug gallery, but you're actually getting it edited the same way you would. So that's the, that's the beautiful part. But I, and
Frederick Van Johnson: Right.
Scott Wyden Kivowitz: I, so, I have one of the Fuji prototypes, the Fujifilm prototypes.
Frederick Van Johnson: Oh, cool.
Scott Wyden Kivowitz: I I haven't fully tested it as far as can it, can I choose which photos go?
Frederick Van Johnson: Mm-hmm.
Scott Wyden Kivowitz: but at the same time, like, do I want to choose which photos go, because then I'm, now I'm slowing myself back down again. You know what I
Frederick Van Johnson: I want the choice, I want, I wanna be able to make the choice of whether I want everything, every time I press the shutter, whatever you record, put that [00:39:00] somewhere. Or, I think what, what Eye-Fi did back in the day, if I remember, it's coming back to me, like on the Nikon, I shoot Nikon. So on the Nikons at least, there was, there's a lock button on there where you can, you can actually lock an image.
So they would, you could set it so it would only send up the images that you locked, which was cool, right? Because you could be shooting, you're like, yeah, that one needs to go, okay, no, this one needs to go. Or later when you're reviewing, you could pick and have those automatically go up. So something like, I think it's not a, it's, it's not a one size fits all.
It's gotta be a situational type thing. But the, the speed, you know, I think that's, that's the part where I need to see this work. I need to, I need to try it in person, on, on something that's mission critical, to see where the holes are. Cause that's the only, the only time you'll see if it's actually viable, you know, to put it,
Scott Wyden Kivowitz: I mean, I
Frederick Van Johnson: it into battle.
Scott Wyden Kivowitz: Yeah, at Imaging USA and at WPPI, which doesn't have great wifi on the trade show floor and around the hotel. Right.
They, so,
[00:40:00] yeah, at Imaging USA, I think they had, I wanna say, like 15 prototypes around the trade show, all people shooting all at once.
And at WPPI it was even more, it was probably over 20 prototypes all at once. And they were constantly, constantly going and showing the, the results. So
Frederick Van Johnson: Well, Scott, you, you've played, so you've played with it, you're a great person to ask this question. So you've seen it in action, you've played with it. You mentioned briefly earlier that, you know, your, your phone, you don't, you don't need the phone. And I think you were saying that more from the standpoint of you don't need to pair and then copy from the, the camera to your phone and then send it somewhere that way.
But it is still using your phone, right? For bandwidth.
Scott Wyden Kivowitz: No, directly.
Frederick Van Johnson: is there a, there's a cellular chip inside these Fuji cameras
Scott Wyden Kivowitz: no, it's, it's wifi. It's wifi.
Frederick Van Johnson: Also, so there's, there's a wifi chip inside it, which means it needs to connect to the local wifi and send that [00:41:00] way. So you're still kind of, you know, you're, you're under the control of whatever wifi connection that you're connected to.
And the, but like you said, in the, in the case of a trade show, everybody and their mama is connected to that wifi.
Scott Wyden Kivowitz: To the really horrible wifi.
Frederick Van Johnson: Is it? And if it still worked in that situation, then I think we got something right? So, yeah.
Scott Wyden Kivowitz: Keep in mind, like, if you are, let's, let's go back to the NASCAR thing. You're shooting NASCAR, maybe you've got yourself a 5G hotspot and it's only you.
And, and now you're good. Right? And I, I have to double check. I have the camera packed right now cause I'm gonna be using it next week, but I think I'll have to double check.
I think it might have an ethernet port as well. But I have to double check.
Frederick Van Johnson: Oh, the, the,
Scott Wyden Kivowitz: cuz it's a grip
Frederick Van Johnson: oh, the camera itself gonna like, uh uh. Oh, wow.
Scott Wyden Kivowitz: I think. I think the grip might have it, cuz the, the one that they gave me has a, has a grip attached. So I have three batteries at once for the
Frederick Van Johnson: If that's true, double check [00:42:00] that cuz if that's true and you can
Scott Wyden Kivowitz: you have gigabit.
Frederick Van Johnson: Well, yeah, well, you could plug, and especially if there's, if they're doing power over ethernet as well, so if you can bring in an ethernet cable and have your, your camera charged at the same time and, you know, basically tethered and whatever it shoots, goes back, you basically have a remote camera at that point where you can set it up and just, you know, go to town wherever you happen to be, as long as you have that, that connection.
But wifi, having the wifi connection, having the wifi connection is the, that's the, I guess that's the weak link, right? For now. Right. So you need to have, you need to have one of those pucks or whatever to give you that 5G, wherever, or a Starlink account or something, you know, so that you could shoot and share. But you still, you, you need,
you know, it's not magic. It's not gonna create something from nothing. So you need, you need internet or wifi access in order to make this work, which I think [00:43:00] more and more is becoming ubiquitous. A lot of today's cars have wifi built into them,
Scott Wyden Kivowitz: Exactly, exactly. It's the future, the future's here. So,
Frederick Van Johnson: Yeah. No, for sure.
Scott Wyden Kivowitz: This was part one of my conversation with Frederick. I would love for you to tune into the second episode, part two of this conversation, which is also live right now in all the podcast players. So head right over to it. Check out part two right now.