PlaneGeometry turned into a sphere

I feel like my logic isn't too awful here. I am trying to convert a PlaneGeometry into a sphere using its UV coordinates as latitude and longitude. Essentially, here is the logic (a quick sketch of the intended mapping follows the list):

  1. convert the UV coordinates to latitude and longitude respectively
  2. change the latitude/longitude over to radians
  3. convert to x, y, z Cartesian coordinates
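
For reference, here is a minimal sketch of the mapping I'm aiming for, written out in plain JavaScript so the ranges are easy to check (the helper name is just for illustration; it assumes UV runs 0..1 on both axes and a unit radius):

    // Sketch of steps 1-3: uv.x -> latitude, uv.y -> longitude, then Cartesian (unit radius).
    function uvToSphere(u, v) {
      const lat = (u - 0.5) * 180.0;                   // step 1: latitude in degrees, -90 .. 90
      const lon = (v - 0.5) * 360.0;                   // step 1: longitude in degrees, -180 .. 180
      const latRad = lat * (Math.PI / 180.0);          // step 2: degrees -> radians
      const lonRad = lon * (Math.PI / 180.0);
      const x = Math.cos(latRad) * Math.sin(lonRad);   // step 3: spherical -> Cartesian
      const y = Math.sin(latRad);
      const z = Math.cos(latRad) * Math.cos(lonRad);
      return [x, y, z];
    }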

Here's the code for the vertex shader I'm trying out:

    varying vec2 vUv;

    #define PI 3.14159265359

    void main() {
      vUv = uv;

      float lat = (uv.x - 0.5) * 90.0;
      float lon = abs((uv.y - 0.5) * 180.0);

      float latRad = lat * (PI / 180.0);
      float lonRad = lon * (PI / 180.0);

      float x = sin(latRad) * sin(lonRad);
      float y = cos(latRad);
      float z = cos(latRad) * sin(lonRad);

      gl_Position = projectionMatrix * modelViewMatrix * vec4(x,y,z, 0.5);
    }

Any advice is appreciated; I feel like I am just missing something small with the logic, but I believe in the masses.

Edit:

The problem ended up being pretty stupid: I was just passing the wrong values for the PlaneGeometry arguments. All solutions here are valid.
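
For anyone hitting the same thing: THREE.PlaneGeometry takes (width, height, widthSegments, heightSegments). I won't pretend these were my exact numbers, but the classic version of this mistake is leaving the segment counts at their default of 1, which gives only four vertices, so there is nothing for the vertex shader to bend into a sphere:

    // THREE.PlaneGeometry(width, height, widthSegments, heightSegments)
    let flat  = new THREE.PlaneGeometry(2, 1);            // defaults to 1x1 segments: only 4 vertices
    let dense = new THREE.PlaneGeometry(2, 1, 100, 100);  // enough vertices to deform smoothly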



Solution 1:[1]

I always use this implementation of spherical coordinates from three.js in shaders when I want to morph a plane into a sphere:

/* page CSS for the demo snippet */
body {
  overflow: hidden;
  margin: 0;
}
<script type="module">
import * as THREE from "https://cdn.skypack.dev/[email protected]";
import { OrbitControls } from "https://cdn.skypack.dev/[email protected]/examples/jsm/controls/OrbitControls.js";
import { GUI } from "https://cdn.skypack.dev/[email protected]/examples/jsm/libs/lil-gui.module.min.js";

let scene = new THREE.Scene();
let camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 1000);
camera.position.set(0, 0, 1).setLength(6);
let renderer = new THREE.WebGLRenderer({
  antialias: true
});
renderer.setSize(innerWidth, innerHeight);
renderer.setClearColor(0x404040);
document.body.appendChild(renderer.domElement);

let controls = new OrbitControls(camera, renderer.domElement);
controls.autoRotate = true;
controls.update();
console.log(controls.getAzimuthalAngle())

let light = new THREE.DirectionalLight(0xffffff, 1);
light.position.set(0.25, 0.5, 1);
scene.add(light, new THREE.AmbientLight(0xffffff, 0.5));

let u = {
    mixVal: {value: 0}
}

let g = new THREE.PlaneGeometry(2 * Math.PI, Math.PI, 100, 100);
let m = new THREE.MeshBasicMaterial({
  map: new THREE.TextureLoader().load("https://threejs.org/examples/textures/uv_grid_opengl.jpg"),
  onBeforeCompile: shader => {
    shader.uniforms.mixVal = u.mixVal;
    shader.vertexShader = `
        uniform float mixVal;
      vec3 fromSpherical(float radius, float phi, float theta){
        float sinPhiRadius = sin( phi ) * radius;

        float x = sinPhiRadius * sin( theta );
        float y = cos( phi ) * radius;
        float z = sinPhiRadius * cos( theta );

        return vec3(x, y, z);
      }
      ${shader.vertexShader}
    `.replace(
        `#include <begin_vertex>`,
      `#include <begin_vertex>
        float phi = (1. - uv.y) * PI;
        float theta = uv.x * PI * 2. + PI;
        float r = 1.;
        transformed = mix(position, fromSpherical(r, phi, theta), mixVal);
      `
    );
    //console.log(shader.vertexShader);
  }
});
let box = new THREE.Mesh(g, m);
scene.add(box);

let gui = new GUI();
gui.add(u.mixVal, "value", 0, 1).name("mixVal");

window.addEventListener("resize", onWindowResize);

renderer.setAnimationLoop(() => {
    renderer.render(scene, camera);
})

function onWindowResize() {

  camera.aspect = innerWidth / innerHeight;
  camera.updateProjectionMatrix();

  renderer.setSize(innerWidth, innerHeight);

}
</script>
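
A side note that may help with debugging: fromSpherical() above mirrors what three.js does on the CPU in Vector3.setFromSphericalCoords(radius, phi, theta), where phi is the polar angle measured from +Y and theta is the azimuth. So you can sanity-check the shader output for any UV pair with a couple of lines (sample values here are arbitrary):

    // CPU-side check of the same mapping, e.g. for uv = (0.1, 0.25):
    const phi = (1 - 0.25) * Math.PI;            // matches (1. - uv.y) * PI in the shader
    const theta = 0.1 * Math.PI * 2 + Math.PI;   // matches uv.x * PI * 2. + PI
    const p = new THREE.Vector3().setFromSphericalCoords(1, phi, theta);
    console.log(p);                              // same point fromSpherical(1.0, phi, theta) returns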

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1