Stable Diffusion: Worldbuilding with 3D and AI Using MVDream - Part 1
Offered By: kasukanra via YouTube
Course Description
Overview
Syllabus
Intro
MVDream: what is it/what does it solve?
Dataset Overview
Camera settings explanation
Multiview perspective
Multiview 2D code dive
Camera embedding
Camera utils
Setting up MVDream 2D environment
Trying out the Gradio server
Start of MVDream-threestudio
Setting up Docker environment for MVDream-threestudio
Explaining why the Gradio server for 3D is not usable
Generating NeRFs through CLI
Exporting meshes
Evaluating MVDream mesh fidelity
Second-stage refinement and why I don't recommend it
How refinement redesigns the output, making it unusable
Showing some other NeRF to 3D mesh objects
Rendering out a 3D object
Using 3D renders as ControlNet guides
Worldbuilding overview context
Potential room designs
Potential chair designs
Generating albedo map texture in ComfyUI
Using Adobe 3D Sampler to convert albedo into PBR textures
Quick setup of converted PBR textures
Using the same process to generate metal textures
Quick overview of using Cube by CSM to convert a picture to a mesh
Checking refined mesh from Cube
Closing thoughts
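The "Camera embedding" and "Camera utils" chapters cover how MVDream conditions its multiview diffusion model on camera pose: each view's 4x4 camera-to-world extrinsic matrix is flattened to a 16-dimensional vector and passed through a small MLP before being combined with the timestep embedding. A minimal NumPy sketch of that idea (layer sizes and weights here are illustrative placeholders, not the official MVDream values):

```python
import numpy as np

def camera_embedding(c2w, w1, w2):
    """Sketch of MVDream-style camera conditioning: flatten each
    4x4 camera-to-world extrinsic to 16 dims, then project it with
    a 2-layer MLP so it can be added to the UNet's timestep
    embedding. Weight shapes are illustrative assumptions."""
    x = c2w.reshape(c2w.shape[0], 16)        # (n_views, 16)
    h = x @ w1                               # hidden layer
    h = h * (1.0 / (1.0 + np.exp(-h)))       # SiLU activation
    return h @ w2                            # (n_views, embed_dim)

rng = np.random.default_rng(0)
w1 = rng.normal(size=(16, 64)) * 0.1         # hypothetical weights
w2 = rng.normal(size=(64, 64)) * 0.1
c2w = np.tile(np.eye(4), (4, 1, 1))          # 4 orbit camera views
emb = camera_embedding(c2w, w1, w2)
print(emb.shape)  # (4, 64)
```

Because the pose enters as a learned embedding rather than raw coordinates, the same conditioning pathway works for any set of orbit cameras sampled at training time.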
Taught by
kasukanra
Related Courses
Learning Industrial Automation (LinkedIn Learning)
Ultimate RunPod Tutorial for Stable Diffusion - Data Transfers, Extensions, CivitAI (Software Engineering Courses - SE Courses via YouTube)
ComfyUI - Node Based Stable Diffusion UI (Olivio Sarikas via YouTube)
Flicker-Free Animations - Stable Diffusion, EBSynth & ControlNet (Vladimir Chopine [GeekatPlay] via YouTube)