PROOF
BrainDrops: A Platform for AI-Generated Art

BrainDrops · Gene Kogan, Justin Trimble, Kevin Rose · 19 Clips · Jan 11, 2022
Episode Transcript
0:02
If you're using Photoshop, for example: an artist using Photoshop has to go through these kinds of manual procedures with a lot of tools to get the results they want. With text, they just have a much faster and richer interface for controlling these, because text is the most natural way for humans to communicate. So it really makes sense, and I could see this evolving to
0:30
not just text-to-image for the sake of art, but text becoming the default interface for computers.
0:45
That was Gene from BrainDrops, a platform for AI-generated art. With more and more artists publishing their first NFTs every single day, there is little doubt in my mind that discovery is going to be very important. Meaning, how do you
1:00
sort through and find great art when there are hundreds of thousands of new NFTs being minted each day? Discovery will come in many forms. Of course there will be art critics, there will be influencers, there will be DAOs, there are the Discords we're all in, and there will be purpose-built platforms for certain genres of art. A couple of examples of that are Quantum Art, which has done a great job so far with photography NFTs, and of course Art Blocks for generative art. But I believe there is another very important,
1:29
nascent genre of art emerging, which is AI art. So, simply put, BrainDrops hopes to be the Art Blocks of AI art. Now, they're just getting off the ground and have only published four drops so far. But if the thesis is correct and AI art really takes off, this really could be an early glimpse of a very important platform. So let's chat with them. This is Justin and Gene from BrainDrops. Justin, Gene, thank you so much for joining me on the show.
2:00
Absolutely, man. Thank you. Yeah, excited to have you guys. In terms of the projects that people ping me about the most but I have yet to go deep on, I would say that BrainDrops is probably number one. I've had a lot of friends reach out and say, this is like the new hot thing, you need
2:19
to look into this. You should get
2:20
involved somehow. So I'm thrilled to have you on the show because this gives me a chance to speak directly to two of the three co-founders, which is awesome. And just
2:29
learn all about your vision. Where do we start? How did you all come together to create this thing? I've been working on art, and the merging of art and tech, a long time. I started out with VR and art, trying to see how to preserve art inside of a virtual reality environment. That was around 2015, 2016, right around the early days of the Oculus development kit stuff. And then about a year into
3:00
fooling around with that stuff was when CryptoPunks came out, and it immediately stood out to me: this is, you know, perfect for the blockchain. I had been into blockchain, but I had never really put all the pieces together; the provenance and the preservation aspect had never occurred to me. It was just a perfect fit for art, to me. So I got super excited about that, dove heavy into the idea of CryptoPunks, and then got back into the VR stuff for
3:29
a while. And then, I was always anonymous, because I also had a lot of family in traditional finance, and it was always taboo at the time, if you were involved in traditional finance, to talk about crypto. So I kept it on the down low; all my accounts on Twitter were anonymous. And then in 2020, I decided to start talking from my regular account and just not be anonymous anymore. I joined the Larva Labs Discord as myself and just got super
4:00
heavy into promoting CryptoPunks and trying to educate people on NFTs and what they were about. I made the Wikipedia page for Punks and also got involved in some of the other Punk projects. So obviously the biggest one was Art Blocks, and Art Blocks really introduced me to the possibilities of generative art in general, and it got me thinking. I
4:29
participated in minting from the first day. I was obsessed with Chromie Squiggles. It's amazing, because those were going for next to nothing back then, right? Yeah, absolutely. And on day one it was three projects, and I think both of the other projects were around 500 in quantity, and then Snowfro had his Chromie Squiggles at 10,000. And at the time it seemed like, oh, that's a massive supply, so everybody went for the others first. Just being a lover of
4:59
what Snowfro is all about and stands for, I was like, I've got to get these too. And then getting them and seeing they're all different, and there are these different categories... my mind was just expanded, honestly. I just saw the potential for a really cool way to create a collection that would be so different. Because at the time, and for a long time, I was very much thinking one-of-ones were where it was at, and abundance could be harmful in a way. But this was more all about
5:29
diversity. And it was a cool thing growing with Art Blocks and seeing the way the network effects worked when you had a collection and a whole bunch of people all collecting the same collection. They're all into it, they're all talking about it. Also, it just makes a more fungible situation: somebody, an owner in the collection, sells a piece for a certain amount, and then none of the other owners want to sell for
5:59
less than that. It was just a very cool kind of vibe. I really got into it from the economic standpoint, from the art standpoint, and the tech standpoint; it was all the interests, all in one. So from there, I'm thinking: it was really cool to see these artists who had been creating generative artwork just out of passion. So how could other artists do something similar? How could this work in another kind
6:29
of field, right? And, you know, on down the line, racking my brains for quite a while, it just became pretty apparent that AI was the place for it. I reached out to some people I knew who were involved in AI, they introduced Gene and I, and really that's where Gene came in.
6:51
Yeah, my journey to this has been, as Justin said, through being an artist and working a lot in digital art. Originally my background
6:59
was in engineering. I became interested in machine learning in college, and worked for a couple of years after that as a researcher at this now-defunct company, on this project called Thurstone, which was trying to create a kind of Pandora-style music recommendation, except based on automatic analysis of the audio, trying to get at the emotional valence and mood characteristics, so you could do a therapeutic playlist thing. But then that kind of led me down
7:29
this road of music and machine learning that eventually led me to art and machine learning and generative art. And as Justin said, I started working with GANs quite early; they've been around for a number of years now but really exploded in the last few years. I got interested in Ethereum around 2016, 2017. At the time there was already a lot of talk about different kinds of crypto art applications. This is still before NFTs got really big, and a lot of people were interested in applications of just your standard
7:59
fungible tokens, not non-fungible tokens, for art. You might recall things like token-curated registries and bonding curves. And, yeah, I started selling NFTs when those became big. I really underestimated them in the beginning. Like Justin, I think it took me a little while longer to get how NFTs would work so well for art. And then that kind of somehow led Justin to finding me, and, yeah, then we decided to go for BrainDrops.
8:24
I'm really curious, Gene, given that you were so early on the GAN side of
8:29
things. I would love to know who the players were around that time, because I've heard multiple people speak of Robbie Barrat as doing some of the early GAN stuff as well. Do you have the kind of history there? Did you interact with Robbie at all?
8:44
Yeah, I knew Robbie pretty early, because I've been working on this educational project called ml4a, Machine Learning for Artists, for a long time. It's trying to get artists started with machine learning, and I had a Slack channel. It's not so active anymore, but at the time, I think 2016, 2017, Robbie was on that Slack channel, kind of sharing his work before he was known too much. It started a few years before that, though. So, GANs were invented in 2014,
9:14
and 2015, I think, is when deep learning, using AI to make generative art, really started to work. I think one of the big projects that really kicked off that movement was DeepDream. You might recall a couple of researchers at Google started producing these really psychedelic-looking images. The original purpose for that was not art; it was actually for trying to visualize and understand neural networks a little better.
9:43
But it had this kind of side effect of making really crazy, trippy-looking art. And, yeah, I would say Alexander Mordvintsev, who was a researcher at Google, pioneered DeepDream, and Mike Tyka was around back then. And then GANs more or less became applicable for making images around the same time, 2015. And, yeah, I was pretty plugged into the deep learning world, but there were a lot of younger
10:13
researchers on Twitter at that time; you know, it was a new thing back then. And so I found out about GANs through them. But at the time there were certainly no artists using them, so you had to be plugged into those circles to find out what was going on. And, yeah, my first project with GANs was in 2015, and there were a few people experimenting with them, but I think it started to hit the art world more generally in 2016 and started to get pretty big in 2017. And then Robbie's story really
10:43
put that into the spotlight, and I think that was how it made its way into crypto. And from there it's just been exploding ever since, this year especially, because of all of the text-to-image stuff. There's just a whole new generation. So it's been pretty
10:58
fascinating. Yeah. And I'm curious, when you say the text-to-image stuff, what are you referring to there?
11:03
There's been a lot of research, you know, over the last few years into generating images from text inputs. So,
11:13
really, imagine your only input to a program is some sentence or phrase, and then an AI, so to speak, or machine learning algorithm, produces an image from it. That's been around for a while; I can recall, for example, one variety of GAN called AttnGAN that was trying to do that maybe two or three years ago, and it was pretty compelling. But this is the first year, I would say, that OpenAI really
11:44
changed things, because they published a blog post, I think right at the beginning of 2021, so probably around this time last year, introducing two very large models that they had trained: DALL-E and CLIP. DALL-E is basically an image-producing version of GPT-3, which is their now very famous text language model. CLIP is a little bit harder to explain, but it's basically a model that connects images and text. And they actually published some of the weights for CLIP, the actual
12:13
model. Well, maybe not their biggest version, but they published a version of it. And, you know, whenever a large company publishes a model which would otherwise cost millions of dollars or something like that to train yourself, a lot of people like me pounce on it and start to experiment with it. A number of really, really creative researchers, engineers, and artists have been experimenting with it over the course of this year, making art using CLIP in combination with different kinds of generative models that
12:43
produce images. And, yeah, I've been participating in that. It's been a really amazing year. The latest results that I've seen from OpenAI, just a few weeks ago: they have basically a new model that's going to supersede CLIP and DALL-E, and it's almost... you just won't believe it. Once you start to see it in everyday applications, you won't believe that it's made by an algorithm.
13:09
I would love to unpack that a tiny bit, in that, technically, when I think of GANs and how they function, and some of the work that I've seen done to date by artists: it's typically people going in, maybe using some of the NVIDIA code, and essentially getting very large datasets of images, 10,000-plus, and training something on them. If you're asking a question from text to image, how does that AI produce those results without having a 10,000-image dataset of the question
13:37
being asked.
13:39
Yes, it does, in a sense. So CLIP, for example: the way CLIP works is that it was trained on a very large dataset, which consisted not just of images, but also of textual descriptions of those images. Labeled, actual text descriptions. So you might have an image of a cat running through a yard or something like that, and then somebody somewhere labeled it "a cat running through a yard." And so you have millions and millions of these images, and so CLIP is
14:08
trained to basically grade the extent to which an arbitrary image and an arbitrary text input are the same. So if you have the text "cat running through a yard" and an image of the same thing, it should give you a high score, and then if you have a completely irrelevant text input, it'll give you a low score. And so basically, what you can do is use CLIP as a guide, in combination with another model, which, all it does is produce images, and
14:39
you can just run almost like a training process. It's an optimization loop, where you continually change the image just a little bit to make CLIP more and more believe that the image you're producing is of the text input you've chosen as the target. So that's the back-of-the-envelope way of describing it, although, yeah, there's definitely a lot
14:58
more. Why would an artist want to use something like text to help them create a piece of artwork, versus just going in and building their
15:08
own dataset of images that they'd like to work with?
15:12
Well, I suppose for everyone it's a little bit different. For me, for example, I'm just fascinated by the technology itself, and also not just the art applications; I'm broadly interested in what else this can apply to. And so I use art as a vehicle for experimenting with these technologies. I guess more generally, I would say text gives you
15:39
more control. If you're using Photoshop, for example: an artist using Photoshop has to go through these kinds of manual procedures with a lot of tools to get the results they want. With text, they just have a much faster and richer interface for controlling these, because text is the most natural way for humans to communicate. So it really makes sense, and I could see this evolving to not just text-to-image
16:08
for the sake of art, but text becoming the default interface for computers, for tooling, and so on.
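The guided-generation loop Gene outlined a moment earlier, scoring an image against a text prompt and nudging the image to raise the score, can be illustrated with a toy example. This is a minimal, illustrative sketch, not the real CLIP API: both "embeddings" here are made-up vectors, the scorer is plain cosine similarity, and the "generator" is just a vector we perturb at random, keeping only changes that improve the score.

```python
import math
import random

def cosine_similarity(a, b):
    """Grade how well an 'image' embedding matches a 'text' embedding,
    standing in for the score CLIP assigns to an image/text pair."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def optimize_image(text_embedding, steps=3000, seed=1):
    """The optimization loop: repeatedly change the 'image' a little bit and
    keep the change only when the scorer believes the match improved."""
    rng = random.Random(seed)
    image = [rng.uniform(-1.0, 1.0) for _ in text_embedding]
    best = cosine_similarity(image, text_embedding)
    for _ in range(steps):
        candidate = [x + rng.gauss(0.0, 0.05) for x in image]
        score = cosine_similarity(candidate, text_embedding)
        if score > best:  # keep the nudge only if the match improved
            image, best = candidate, score
    return image, best

# Pretend embedding of the prompt "a cat running through a yard".
target = [0.2, -0.7, 0.5, 0.1]
image, score = optimize_image(target)
```

After a few thousand accepted and rejected nudges, the score climbs toward 1.0: the fake "image" ends up pointing in the same direction as the fake text embedding. That is the whole trick, just with a real image generator and a real CLIP in place of these stand-ins.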
16:14
Right, got it. I totally get that, and there is a world where, in the not-too-distant future, you launch Photoshop and there's a text prompt at the top,
16:24
and you have an image that is
16:26
up on the screen, and there are some kids in the background, and you can say, remove the kid on the left, or change the color of the decking to yellow. Oh, that's a little too much, dial it back a tad.
16:38
All of those could just be things that you type in and hit enter, and then the AI goes to work on the image. Is that kind of what you're envisioning?
16:46
Yeah, exactly. Yeah, things like that. And it also makes it a lot more accessible, because, yeah, I think it takes a really long time. I still can't use Photoshop; I've just never gotten that good at any of those tools. But, you know, I can speak. So I can imagine that this just opens up the ability to manipulate images
17:08
to a whole lot more people.
17:11
Hmm. That is so cool. Yeah, and if I could also just chime in on the way I see that progressing: I think you hit the nail on the head, Kevin, making more intuitive tools. Because I feel like there are so many super creative people out there who are coming from distant or different artistic backgrounds, so they may have this amazing creative vision, but they don't have the technical tooling or the artistic background to
17:38
make that vision a reality. However, as the tools become more intuitive and you're working with AIs, you'll see, even in long-form media, amazing things that you never thought possible, just because these super creative people who didn't necessarily have the technical skills can now participate. Such a great way to frame it. Yeah. For me, as a product developer, someone that's built products for my entire career:
18:08
I've always been the kind of overarching idea person, where I would wireframe things, then sit down with a designer, and we'd both agree that we're better together than independently working on our own, riffing off of each other. But I never had the skills to be able to translate what I was thinking into actual pixels on the screen. And so there's something really powerful about that. You can imagine that this unlocks a whole slew of different fields, including creative writing as well. Just, like, how
18:39
fantastic will this be for people that have a big vision, for a novel or something they want to go after, but need a helping hand from AI to fully flesh something out? It could be really cool. Yeah, for sure. And also custom, personalized content. Right now, you go onto Netflix, you get a recommendation for a show because they've seen what you've watched and liked in the past. But imagine a future where, instead of just picking from
19:08
the existing catalog, it's actually creating a custom, binge-worthy TV show based on your interests, generated on the fly for you to enjoy.
19:20
That's insanity. Yeah, and, you know, really kind of terrible, too.
19:24
Yeah, it'd definitely be difficult to pull yourself away from the TV. Yeah, we'd just be sitting
19:28
there, and you're like, why can't I turn away? And it's like,
19:32
there's more, right? Yeah, you definitely want, like, a non-closed-loop interface for something like that, so that it can't be
19:38
reading your neurons in real time and getting you more and more addicted to
19:43
something. That's crazy. Well, let's go back to BrainDrops,
19:46
because, like, I'm excited to hear it. When I went to BrainDrops for the first time, I think it was Pindar who I saw doing a drop on there. I can't remember whether it was Pindar or Claire. I just was like, wow, if this level of artist is interested in what you're doing, you guys are building something special. So I'd love to hear: what is the vision? When people think BrainDrops, what do you want them to think?
20:08
Like, what is this going to be? What type of platform is this going to be? Really, when people think about BrainDrops, what we want them to think is AI art, and that this is the platform, the home, for where we see the beginnings and then also the evolution. Like, we want to be super flexible, because I think the art form itself and the tools around it are growing at such a rapid pace that we want to be able to grow and evolve. So, in the initial stage,
20:39
it really was purposeful to pick the three artists that we did. So Gene is project number one, and Gene obviously comes from the super technical side of things, and while he was slightly later to get into NFTs than, like, Robbie, he's been working on GANs since their inception. So we really wanted that as a co-founder of the platform. Part of having three projects drop on day one was my personal
21:09
nostalgia with Art Blocks, and wanting to give a nod to Art Blocks. So there was Gene. And then in the middle, also on the technical side of things, is Pindar. He's working with a lot of tech, but he's also creating his own datasets, so he can make these little creatures that are all very recognizable, part of a cohesive set, and work in rarities. As somebody who's involved in crypto and just, like,
21:38
truly gets the concept from a collector's perspective, which is something that I feel like I'm also coming from, a collector's background. So he's really trying to work that into the collections: having this collectible feel where there are rarities, and then there are more pieces that also look super dope, but they maybe just aren't as rare as other items. And then there are Claire's pieces. Claire's coming from a less technical
22:09
background, but she's thoughtful: just hours and hours, years really, probably, curating and working on putting together a collection of pieces that really resonate with her, and resonate with the people that she interacts and engages with on a daily basis. How did she use AI? Because I'm familiar with Claire's work, and I'd be curious: of these 500 that were minted, her genesis drop, what was the application of AI to her artwork?
22:38
I'm pretty sure you'd have to ask her, but I'm pretty sure the bulk of what she did were iterations through Artbreeder, which is an open tool; you know, anyone can use it. And then she created a narrative around a very particular aesthetic and her stance on AI-collaborative art. She calls herself an AI-collaborative artist, and, I mean, really,
23:09
I think there are levels to it. Obviously there's some human interaction going on right now with all AI art, to a certain extent, so it's really all AI-collaborative art. But I think she really identifies with the term AI-collaborative artist, because she feels, and she says this herself, her quote is, you know, "taste is the new skill." So her as the curator: that's the way she sees her role in cultivating the collection.
23:38
And it's pretty cool. But I felt like those three were all different, and I also really think that it's important going forward, because so many people, when they think about AI art, see GANs that come out that have very similar styles, because they aren't really customized as far as the data that's been used to train them. And so, going forward, really providing as many collections as
24:08
possible that have super diverse aesthetics is a big thing that we want to do. We also want to shine a spotlight on artists and researchers who have been working in the field with really no one paying any attention, just in the background, just doing it for the love of it. And then there's also the way we think that it'll evolve. So in the very beginning, I think there will be three different categories. So right now we have artist-curated,
24:38
where the pieces have been selected by the artist: there's lots and lots of output, and the artist decides which pieces are going to be part of the collection. And then we're working in... so, Gene has this other sister project, and we'll get into that more in a little bit, but basically the plan is to work the pipeline from that project into our platform as well, so that we can provide a more
25:08
dynamic process, where there's on-the-fly minting. So basically, to give an example, it would be, like, fully AI-curated, where the artist actually doesn't play a role in the curation, but they have trained the models on a particular dataset, and then the person is able to click the button to mint, and then a piece comes out, and then everyone sees it for the first time. Crazy. Very similar to Art Blocks
25:37
in that sense. So when do you have that
25:39
planned, or slated to go out? When is the first time you'll be doing that live? Because you're going to be asking, in some sense, the machine to actually produce that image for the very first time. I don't think I've ever seen that done before on the AI side. Is that right? Correct. Eponym has done it, and Gene is already doing it now with Abraham. If you go to Abraham AI, people can get
26:07
access on the Abraham Discord to participate in that, and they can actually type in whatever words they want, and then Abraham spits out an artwork based on what the input
26:18
was. The idea with Abraham: this is a project that I've been working on for a couple of years actively, but I worked in the text-to-image stuff this year. And the idea there is, well, there's a bunch of ideas there, but the one that we're talking about is that you can actually generate the image on the fly, you know. So you
26:36
can embrace that and actually give that power to the minter, to customize their work. And so, the plans for rolling that out with BrainDrops: we haven't set a date yet, because we're still actually developing it. We have a basic demo of it working on Abraham, and now we're doing things like UI and just engineering it, and then trying to develop a way of integrating that into other projects, BrainDrops specifically. But it'll certainly take us a little bit of time to integrate that into the website.
27:07
The website right now, it's basically: the artist produces the assets ahead of time, they pre-curate them, and they're just static assets. And so changing that to be dynamically generated will take some work to get it working on the website, and also it would change the contract a little bit, because you don't actually know what the image is before it's produced, depending on how you do it. So I would say that we're cautiously optimistic that we can do something like that in Q1, maybe Q2, depending on how things
27:36
go, and whatever unknown unknowns come up between now and then. But for me, the most exciting aspect of this is to take advantage of the technology, the things that this technology specifically really lets you do. Like, you can't do this with other kinds of digital art. You can't even really do it with other kinds of generative art. Like, with Art Blocks, I think they get close, because each artwork is generated dynamically, at least for a particular person at a particular time. But for it to also allow the minter to make an
28:06
imprint on it is something that I think is a unique faculty of AI. And so that's where we're hoping to really
28:14
innovate. For people that are listening: you have to go check out abraham.ai/creations, and we'll link this in the show notes. This is some really creepy / awesome / all-of-the-above. People are putting in these phrases, and some of them are just really amazing in what the output is by the AI; other times it's a little bit odd. And then it's just, it's really
28:36
Really cool.
28:38
Yeah, we have a community of people that are interacting with it. We haven't actually created enough keys yet; we're still, like I said, in development. But there are people excited about the future, and we've promised that we will tokenize it, and so that's something that's also on my agenda for the year. But at least the tool itself for making creations is working. And so if you go to us and get on the Discord and let me know that you're interested, you can get these free, quote-unquote, tokens. The tokens are actually, they're not crypto tokens;
29:07
they're just, like, generator permissions, permissions to use the generator, because we can't make it fully open just yet, because it's all GPUs, and so it would basically be a disaster if we did that. But hopefully in the future, it'll be pretty open.
29:22
Very cool. Yeah. And just real quick, just to touch on the three categories. So there's what we're working on now, which is the artist-curated stuff. And then there would be the AI-curated stuff, where there's nothing other than: you press mint,
29:37
and an artist has trained the models, they've decided what the potential outcomes could be, but the exact outcomes would be totally determined by the AI, and everyone would see it at the same time, including the artist. And then there would be a third category, which would be community-curated. So, say, the community... it's like what's going on already with Abraham, where somebody can type in phrases; they can really be even more a part of the art than just their transaction being part of it. They're
30:07
actually, like, able to put in words, and then if they don't like the output, they don't have to mint it. There's lots; we want to be really flexible. Like I said, we want to evolve with all the potential of what can be done. Yeah, it's really cool when you think of it: at some point there are just going to be, like, Lego pieces that you can plug into the AI. In some sense you could say, okay, as an end user, what do you want your input to be? Do you want your input to be an oracle of the weather where you're located, or do you want the input to be your
30:37
favorite sports team, or do you want it to be a phrase that you type in? Like, you can play with this, you can slice and dice, in so many different ways. Yeah, the potential... I think that's one of the things that I find exciting about AI art: it just feels so limitless, just like AI itself. Yeah. On your Twitter page you mentioned that you're doing, at least for the curated pieces initially, 500 to 1,000 pieces per collection at 0.1 ETH. Is it always going to be 0.1 ETH?
31:08
That's the plan for now, because things are so early in AI art. Like, we're trying to shine a spotlight on these new artists, and obviously the idea is for the collectors and the artists to all grow together. So we feel like it's important to offer relatively, you know, reasonable prices, so that everybody can get in and then grow together as the space grows. I hate to use terms like "forever," because
31:37
it pegs you in. But for the foreseeable future, it certainly seems like that's the direction we're heading. There are a couple of people, I don't necessarily want to name their names, most people would probably guess who they are, who are interested in doing drops with us. We may make exceptions, just because they're more notable and they sell art for quite high amounts of money, and so it may not be so fair for them if they
32:07
did a 0.1 collection. I guess we could maybe tweak the supply a little bit. We're also playing around with some other drop-mechanism dynamics, which could be interesting. So maybe, like, they would release a smaller-quantity collection, and it could be a little bit more, but we'd also make it, like, a lot harder for people to be able to participate in the drop. The new contract is going to have some different little things that we can do. I won't get into too many details, but there are some things we can do
32:38
that can make it harder for bots to get pieces, which has always been an issue for us. We've only had two drops, but the drops have gone super fast; the last drop was like one second. There was some botting that went on, so we've addressed that; it really is only one mint per transaction. Have you used Premint at all? Have you seen Premint?
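The contract-level mitigations mentioned above (one mint per transaction, per-wallet limits) can be sketched roughly like this. This is a hypothetical simulation, not the actual BrainDrops contract; the caps and wallet addresses are made up for illustration.

```python
# Illustrative simulation (not BrainDrops' actual contract) of simple
# bot mitigations: cap mints at one per transaction and track a
# per-wallet total. Both limits here are hypothetical.

MAX_PER_TX = 1
MAX_PER_WALLET = 2  # made-up cap for the sketch

minted_by = {}  # wallet -> number of pieces minted so far

def mint(wallet: str, quantity: int) -> bool:
    """Reject the whole transaction if it exceeds either cap."""
    if quantity > MAX_PER_TX:
        return False  # batching many mints in one tx is refused
    if minted_by.get(wallet, 0) + quantity > MAX_PER_WALLET:
        return False  # wallet already at its limit
    minted_by[wallet] = minted_by.get(wallet, 0) + quantity
    return True

print(mint("0xabc", 1))  # True
print(mint("0xabc", 2))  # False: more than one mint per transaction
```

The real deterrent logic lives on-chain, but the shape is the same: each check is a cheap condition evaluated inside the mint call itself.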
33:01
I know Pindar has used it; he was doing like a private whitelist, and it's basically just limited spots for drops. It makes the users authenticate multiple things, like their Discord and their Twitter, and it looks at how long the accounts have been around, and a bunch of different things. So it's a really good kind of bot-elimination tool, right? Yeah, we're exploring stuff like that. One thing we are doing for this next drop, and I don't know when this podcast will come out, but today is the third, is that we'll have our next
33:30
drop on January 5th. And for that drop, and actually for future drops for the foreseeable future, we announced this last night: if you own at least one piece from each of the first three collections that were dropped on day one (Brainloops, Genesis, and PodGANs), you can premint one piece before
34:00
it opens up to the public. Oh, help me understand though. Is that one piece per type, or if I own one of these do I get to mint three? No, it would be one per wallet that holds all three. Okay, got it. But you don't have to hold all three? You can just hold one of them? No, you do need to own the full set, the full set in one wallet. Okay. When do you do the snapshot for that, by the way? So we're not doing a snapshot. We're going to do it live, so people can continue to acquire
34:30
a set, so they can acquire a set up to the time of preminting, and that way it gives everybody a chance, if they want to participate like that, to do it. Awesome. And it'll be the same way for future drops, so we don't have to do a snapshot. I love that; that's the way to go. It's so funny to me that we have all the data on-chain in real time at any point, but yet so many people are like, it's time to do a snapshot. You're like, well, how about we just look at the wallet when we're doing the transaction? Denied, denied.
35:00
Doesn't
35:00
have it. It's just code that can be written to do the checks, you know.
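The live eligibility check described here, querying holdings at transaction time instead of from a snapshot, can be sketched in simplified form. This is a hypothetical illustration: the collection names, wallet addresses, and balance table are stand-ins, not BrainDrops' actual contract or data.

```python
# Sketch of the "no snapshot" premint rule discussed above: instead of
# freezing a holder list ahead of time, just check live balances across
# the three launch collections at the moment of the transaction.
# All names and balances below are illustrative stand-ins.

LAUNCH_COLLECTIONS = ["collection_a", "collection_b", "collection_c"]

# Stand-in for on-chain balanceOf lookups: wallet -> collection -> count.
BALANCES = {
    "0xFULL_SET": {"collection_a": 1, "collection_b": 2, "collection_c": 1},
    "0xPARTIAL":  {"collection_a": 1, "collection_b": 0, "collection_c": 3},
}

def can_premint(wallet: str) -> bool:
    """Eligible only if this single wallet holds at least one piece
    from every launch collection right now, no snapshot needed."""
    holdings = BALANCES.get(wallet, {})
    return all(holdings.get(c, 0) >= 1 for c in LAUNCH_COLLECTIONS)

print(can_premint("0xFULL_SET"))  # True: owns the full set in one wallet
print(can_premint("0xPARTIAL"))   # False: missing collection_b
```

Because the check runs at transaction time, a collector can complete their set right up until the premint opens, which is exactly the flexibility described above.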
35:04
yeah, for sure. And I try to look at things from a collector's perspective, and it's something that I feel is important. The early collectors are so passionate, and I really love our community. We've got about 1,200 collectors in the community right now, and it's a pretty tight-knit group. There are a lot of people who really want to continue to collect a full set, and as things get more and more popular it just gets harder and harder.
35:30
Obviously, these days are different than the early Art Blocks days, when it was easier as a super fan to keep your full collection, your complete collection, intact. Now there's just so much demand, especially from flippers, that pushes those people out. Obviously there's still the secondary market, but this is a way to reward the people who have been believing in us from the beginning, who want to participate, and who are here for the long term.
36:00
That's awesome.
36:02
it totally makes sense that
36:03
this product needs to exist, given the rise in AI art. You all probably feel the same way I do, that these are such early innings for this whole space. I have a feeling the next 10 years are going to be explosive for AI. Yeah, I can't imagine how big synthetic media is going to get. I mean, there's so much AI going on. Gene knows way more than me, but from what I know about the AI going on behind the scenes in media companies
36:30
already, it will be fascinating to see the evolution of this. And like I say, evolving with things as they continue to expand and grow is what we're really aiming to do. We want to be selective; we aren't necessarily trying to put out tons and tons of artists, especially in this first year. We're being pretty selective, and it's still invite-only. However, we've been talking about putting out an application pretty soon, because
37:00
there are a lot of people doing cool stuff who don't necessarily have a big social media presence; they're just doing rad stuff. They're smart in so many ways, people who are just brilliant. Maybe they can make these massive datasets in something like Unreal Engine, and then take those massive datasets, put them through models, and make super cool stuff. And so we're open to
37:30
anybody who's doing something really rad. We want to know about it and help introduce them to a bigger audience. Yeah, I'm curious, and this is probably a great time to ask this question of both of you. You have a handful of artists, just a very small number because you're just getting off the ground, but are there other AI artists in the space you look to who are doing this new kind of bleeding-edge work, who you think are smaller, who people may not have heard of yet and deserve some attention?
38:01
Definitely. Since I've been tracking this for a long time, I've become good friends with a lot of people in this world, and not everyone is particularly well known. I would say this year especially there's kind of a new generation coming up, and I think NFTs have a lot to do with it. Now there's actually a way to earn from this kind of work, whereas before there didn't used to be. And so we've become acquainted with a handful
38:30
of pretty young or relatively obscure people who are working with the technology trying to make art, and we hope to shine a spotlight on them. To me, the logical conclusion of this is that there's a new industry kind of bubbling up. The same way the early web was all static and then the big change was dynamic, user-generated content, I think there's going to be another transition to
39:00
content that's generated on the fly. I think 90% of the content on the internet, in the metaverse, in gaming, is going to be generated on the fly for a specific context, maybe for a specific person at a specific time: very transient, very ephemeral. I think there are going to be orders of magnitude more AI-generated content on the internet than there is now, and we'll see how it all evolves. It's definitely hard to imagine, because it seems like such a big
39:30
thing now. In my world, it's the biggest thing, so it feels like there's a lot of clamor. But I remember thinking, oh wow, GANs are so big now; that was in 2017, and I couldn't even imagine how 2020 and 2021 would play out, because our worlds appear so much bigger to us than to people in other niches. Crypto is still so early, and this is still pretty early. I think it's in a good position to keep growing.
39:57
And Gene, when
39:58
you and Justin are finding these new artists you want to work with, or ones you want to have on the platform, do you believe it's something where they're still independently tinkering? Like, they're firing up their own EC2 instances, they're installing GAN software, they're running all this on their own? Or do you see that kind of going away, and it being more about mainstream, bigger models and the creativity of how
40:27
they interact with and combine models through tooling? Is it going to transition that way? Meaning, is there a world where OpenAI and some of the others own the tooling, and as an artist you never have to think about firing up a server or training your own dataset? Where is that currently, and where do you think it's headed? I see it as three-pronged; there are these three prongs, right? There are the people who are super technical and just amazing at doing all the technical stuff. There are the people who put out
40:57
unbelievable aesthetics. And then there are people who are just amazing self-marketers; they're out there and they know what they're doing in terms of building relationships and making a name for themselves. The perfect soup is to have all those things together. So it's a balance, figuring out where the artist may be lacking in these things, but then also,
41:27
especially since we're being so selective early on, I feel like it's a family, so we're all supporting each other and wanting to grow all together at the same time. So yeah, it's an interesting balance.
41:43
I agree; it'll probably be a mix of things. But certainly the pipeline has multiple stages, let's say, and I can imagine that some of those stages
41:57
will, for a while, probably have some interaction with cloud services, certainly OpenAI, but you can also imagine Microsoft, Google, and Chinese companies coming out with GPT-3-type cloud services, and those things are actually very hard to replicate as an individual. One thing I'm really interested in: there are some grassroots initiatives to create, I wouldn't necessarily say decentralized, but more open, more open-source and permissive
42:27
alternatives to OpenAI. There's one group I'm pretty well acquainted with, EleutherAI, which is a grassroots initiative to train open-source versions of GPT and other kinds of really large, expensive deep learning models and make them available broadly. And those models don't necessarily limit your creativity; they become this new baseline for all of us. We don't, for example, go, oh, we're
42:57
locked into Unix or something like that; we're just already layers deep. And these models are so versatile. They can be used to create things that have almost nothing to do with each other, except that they have some descent from these models. So a lot of us will certainly be using the same models for a long time. But then there's a lot of staging and tooling around them that can very much be the initiative of artists, and should be the initiative of artists trying to innovate and push
43:27
boundaries. And I think in the future, you'll see these things
43:30
interact. Yeah, it's interesting that you say people are creating alternatives, more open, decentralized versions of OpenAI, when literally their name is "Open" AI.
43:40
Yeah. Yeah, you would think they would be the most open, and that's funny.
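Gene's point about shared base models with artist-built staging and tooling on top can be illustrated with a toy pipeline. Everything here is a stand-in: `base_model` mocks a large shared model, and the prompt staging and post-processing are placeholders for what an individual artist might layer around it.

```python
# Toy sketch of the "models as baseline" idea discussed above: a shared
# base model is common infrastructure, and artists differentiate through
# the staging and tooling they build around it. The base model here is a
# stub, not a real GPT or diffusion API.

def base_model(prompt: str) -> str:
    """Stand-in for a large shared model (GPT-style, diffusion, etc.)."""
    return f"<output for: {prompt}>"

def artist_pipeline(user_text: str) -> str:
    """One artist's tooling: prompt staging before the shared model and
    post-processing after it. Two artists can share base_model yet
    produce very different work by changing only these layers."""
    staged = f"in the style of hypothetical-artist, {user_text}"
    raw = base_model(staged)
    return raw.upper()  # stand-in for artist-specific post-processing

print(artist_pipeline("a storm over the ocean"))
```

The staging and post-processing layers are where, as Gene puts it, the initiative of artists lives; the base model underneath is the same for everyone.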
43:44
Awesome, this has been a fantastic overview. Are we missing anything here? I think my takeaway is that this platform is the first one I've seen that is going to be doing this type of work on AI.
43:57
And I think Art Blocks is a great proxy for how people can think about it, but for AI. If people want to get involved, you have a Discord; that's probably the best place for people to go, I would imagine. And would you say that's a place for learning? Can people learn how to actually create this style of art through the Discord as well, or is it just for collectors? Well, actually ML4A, which Gene touched on a little bit earlier, is an educational resource he has worked on for a long time, and we have links to that
44:27
in the Discord, and our Twitter is @braindrops_art. But yeah, we have plans. We have our hands full right now, obviously; we're not even two months in, so we're super busy making sure we get the artists we want and also helping them. A lot of them are not coming from an NFT background, so we're helping them figure out what a collection looks like with their workflow and how
44:57
they could maybe work in rarity traits and things that are fun for collectors, and also trying to represent so much of the cool tech people are working on, because some things are progressing so quickly. It's just wild to see some of the stuff people are doing. And even the people creating the tech that some other artists are using, the people actually originating the technology, don't necessarily want, for whatever personal reasons, IP rights, whatever,
45:27
to be the ones to release the work. But we touch base with them and make sure they're cool with their, almost, apprentices releasing a drop with us. So yeah, it's been wild to see. And people are super open to sharing their processes and the kind of stuff they work on. If you go into any of the artist channels, and obviously there aren't a lot yet, we only have four artists that have dropped so far, with the fifth coming up on Wednesday, but
45:57
if you go into the artist channels you can talk to the artists, and they're very open about their processes and how they go about doing their work. So we want to keep it open and let people play around with the tools, because, like Gene says, these models become a baseline, but then it's about, okay, how do we keep pushing the boundary? Yeah, awesome. Well, I'm so stoked that you all exist and that there's something for this community,
46:22
because, to me, it's been
46:24
I've heard about bits and pieces.
46:27
I've followed Pindar's work, and a bunch of the artists were already working independently, but there's been no centralized home for AI work, and it's great that BrainDrops is creating that. That's awesome. Well, thank you so much for being on the show. We'll link up, obviously, all the things we talked about today and get people into your Discord to learn more. Thank you so much, Kevin, man. We really appreciate it. This is awesome. I'm a big fan of yours, so it's an honor to be on here. It really
46:53
is, thank you. Yeah, like Justin said, thanks a lot
46:57
For having us
46:57
We'll definitely have to have you on again as you come up with more of these features and functionality, because it is very early days and I'm excited for the roadmap you've already laid out. It's going to be fun. Oh yeah, hell yeah, awesome.
47:09
All right,
47:10
that is it for this episode. Thanks so much for tuning in.
47:12
If you would like to help us out, head on over to proof
47:15
dot xyz
47:16
and click on the reviews button at the very top and leave us a
47:19
five-star review. Thanks so much. Take
47:21
care.