Sr. Bizops; GTM B2B Team @ Replit Formerly founder Europa Labs, B2B Storj.io, Microsoft
Artificial intelligence text-to-image tools like Stable Diffusion, Midjourney, and Dalle-2 are rapidly unlocking new possibilities for memes, marketing, and predictive learning in advertising.
Stable Diffusion was officially released into beta on August 22 as an open-source alternative to Dalle-2.
Because its weights are open-source, anyone can access and experiment with the parameters used to train the machine learning model. Stable Diffusion does not impose the restrictions on political or sensitive content that Dalle-2 does, so users must leverage the model at their own risk.

As such, these open-source models are especially well suited for web3-based decentralized cloud storage models, like Storj DCS.
Storj DCS is a decentralized object storage solution where data is encrypted and erasure coded by the client – and then distributed across a network of uncorrelated nodes around the world.
A “network of uncorrelated nodes” means resilience through diversity – due to its unique architecture, data stored on the network can’t be withheld, censored, or “held hostage” by any one individual, company, or state actor.
As part of a web3 stack, Storj DCS can replace Amazon S3-backed solutions and act as a distributed, decentralized, multi-cloud IPFS pin – worthy of the “web3” moniker.
Storj breaks each file into 64 MB segments, each erasure coded into 80 distributed pieces, any 29 of which can rebuild the segment. This means that when you stream a file, instead of pulling a single data stream sequentially from a datacenter like AWS US-EAST-1, you are parallelizing the download into multiple, concurrent streams.
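As a rough sketch of that arithmetic (the constants mirror the figures above; this is an illustration, not the actual Storj implementation):

```python
# Hypothetical illustration of Storj's segmentation and erasure-coding math.
SEGMENT_SIZE = 64 * 1024 * 1024  # 64 MB segments
TOTAL_PIECES = 80                # pieces generated per segment
MIN_PIECES = 29                  # pieces needed to rebuild a segment

def segments_for(file_size_bytes):
    """Number of 64 MB segments a file is split into (last may be partial)."""
    return -(-file_size_bytes // SEGMENT_SIZE)  # ceiling division

# A 1 GiB file becomes 16 segments of 64 MB each.
print(segments_for(1024**3))                # 16

# Expansion factor: stored bytes vs. original bytes (80 pieces, 29 needed).
print(round(TOTAL_PIECES / MIN_PIECES, 2))  # 2.76
```

So the network stores roughly 2.76x the raw data, but any large minority of nodes can vanish without losing a single file.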
This combination of decentralization, erasure coding, and parallel packet streaming has a number of advantages over pinning to a centralized datacenter like AWS.
Critically, rather than relying on a single service provider, Storj facilitates a competitive, open, and global market for bandwidth – the fastest responders worldwide recreate the file at the source (client-side). In addition to better resiliency and uptime, this quality of decentralized cloud storage also means better performance, everywhere.
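The fastest-responders idea can be sketched with a small Python simulation – `fetch_piece` and the random latencies here are stand-ins, not the real Storj client:

```python
import concurrent.futures
import random
import time

TOTAL_PIECES, MIN_PIECES = 80, 29

def fetch_piece(piece_id):
    # Stand-in for a real storage-node request: reply after a random latency.
    time.sleep(random.uniform(0.001, 0.01))
    return piece_id

# Request all 80 pieces in parallel and keep whichever 29 arrive first;
# slower nodes are simply ignored once the segment is reconstructable.
with concurrent.futures.ThreadPoolExecutor(max_workers=TOTAL_PIECES) as pool:
    futures = [pool.submit(fetch_piece, i) for i in range(TOTAL_PIECES)]
    fastest = []
    for done in concurrent.futures.as_completed(futures):
        fastest.append(done.result())
        if len(fastest) == MIN_PIECES:
            break

print(len(fastest))  # 29
```

Because only the quickest 29 of 80 responses matter, a few slow or offline nodes never block the download.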
In this developer walkthrough, we will utilize this cutting-edge AI tool to pull an image from a bucket on the decentralized cloud, transform/diffuse it, and upload the result back to a new bucket.
The code for this solution can be found on Github here: https://github.com/keleffew/decentralized-diffusion/blob/main/Img2Img_Stable_Diffusion_on_Decentralized_Cloud.ipynb
The weights, model card and code for the Stable Diffusion model can be viewed here: https://huggingface.co/CompVis/stable-diffusion.
To get started, let’s first upload a base image – the one we will transform – to the decentralized cloud.
This can be done via the command line toolkit (https://docs.storj.io/dcs/getting-started/quickstart-uplink-cli/uploading-your-first-object/), or by using the Web Portal App at storj.io/login.
For this guide, I created a sketch of a bear, using the best of my artistic talents, and uploaded it to the decentralized cloud. You can see the results below.

The image is sharded and dispersed across the decentralized cloud. We can see the real-time node distribution below:
Sketch image source: https://link.storjshare.io/s/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg
After uploading the file, grab the link from Storj linkshare and swap the /s/ path segment for /raw/ – indicating a direct download.
Example: https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg
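That rewrite is plain string manipulation; a small helper (hypothetical, not part of any Storj SDK) might look like:

```python
# Convert a Storj linkshare URL (the '/s/' browser view) into a
# direct-download URL by swapping in the '/raw/' path segment.
def to_raw(link):
    return link.replace("link.storjshare.io/s/", "link.storjshare.io/raw/", 1)

share = ("https://link.storjshare.io/s/jxtuohe5sssowgr4dutpq32xgy6a"
         "/machine-learning-test/BearDrawing.jpeg")
print(to_raw(share))
# https://link.storjshare.io/raw/jxtuohe5sssowgr4dutpq32xgy6a/machine-learning-test/BearDrawing.jpeg
```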
The S3 Gateway lets you use existing AWS libraries (like Boto3 for Python) to upload data directly to the decentralized cloud.
For the codebase linked on GitHub above, we will need to input the S3 credentials: an access key, a secret key, and the gateway endpoint. The S3-compatible gateway docs at https://docs.storj.io/dcs/api-reference/s3-compatible-gateway/ cover how to generate these.

Navigate to: https://colab.research.google.com/drive/1hyHOBsy9UqS79vewLjcmBXeYTBkM6qpK#scrollTo=uMqbvbHGR6O8 and start to run the code blocks as described.
Run Steps 1-2 in the Jupyter notebook to import the relevant dependencies.
To access the Stable Diffusion model, you will need to create a Hugging Face account (https://huggingface.co/) and create a User Access Token to import in Step 3.
from huggingface_hub import notebook_login
notebook_login()

In the final Step, enter the relevant Access Key, Secret Key, and Gateway Endpoint config information.
# Export the transformed image to the decentralized cloud.
# Boto3 is the AWS SDK for Python; Storj's gateway is S3-compatible.
import boto3

# Pull S3 credentials from Storj.io; for config docs see:
# https://docs.storj.io/dcs/api-reference/s3-compatible-gateway/
s3 = boto3.resource('s3',
                    endpoint_url='https://gateway.storjshare.io',
                    aws_access_key_id='ACCESS',
                    aws_secret_access_key='SECRET')

# Upload the generated file to the bucket
with open('brittish-gosling.png', 'rb') as data:
    s3.Bucket('machine-learning-test').put_object(Key='test.jpg', Body=data)
Now that the configuration has been updated, we can run the script, and an image will be pulled from the Decentralized Cloud, transformed, and then re-uploaded in parallel.
Let’s check out our result:

Congratulations!
You have successfully used Stable Diffusion with Storj DCS, storing the transformed data in a decentralized manner on web3!
Kevin Leffew is the Chief Product Officer at Europa Labs, a software company building web3 applications and infrastructure for companies, creators, and brands.