Domain: wap.tom94.com
More information on this domain is in AlienVault OTX.
DNS Resolutions

Date        IP Address
2018-11-11  13.94.63.122    (ClassC)
2024-10-23  135.125.203.98  (ClassC)
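The "ClassC" annotation next to each resolution denotes the /24 network the address falls in. That grouping can be recomputed from the table with Python's standard `ipaddress` module (a sketch using only the two IPs recorded above):

```python
import ipaddress

# Resolutions as recorded in the table above.
resolutions = {
    "2018-11-11": "13.94.63.122",
    "2024-10-23": "135.125.203.98",
}

# Map each observation date to its /24 ("ClassC") network.
class_c = {
    date: str(ipaddress.ip_network(f"{ip}/24", strict=False))
    for date, ip in resolutions.items()
}
```

The two resolutions fall in different /24 networks (13.94.63.0/24 vs. 135.125.203.0/24), consistent with the domain having moved hosts between the two observations.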
Port 80
HTTP/1.1 200 OK
Date: Wed, 23 Oct 2024 09:42:37 GMT
Server: Apache/2.4.59 (Debian)
Vary: Accept-Encoding
Transfer-Encoding: chunked
Content-Type: text/html; charset=UTF-8

The response body is the personal homepage of Thomas Müller (page title: "Thomas Müller"):

About me

Hi there, I'm Thomas! I'm passionate about all kinds of nerdy things. To list a few: physics, AI, space travel, computers, music, electronics, and, of course, graphics! I am also principal research scientist at NVIDIA (https://research.nvidia.com/person/thomas-mueller). My PhD is from ETH Zürich / Disney Research, where I had the pleasure of integrating my work into Disney's Hyperion renderer. I also created several open source projects, including the award-winning neural graphics and 3D-reconstruction tool instant-ngp (https://github.com/nvlabs/instant-ngp), the high-speed machine learning framework tiny-cuda-nn (https://github.com/nvlabs/tiny-cuda-nn), and the image comparison tool tev (https://github.com/Tom94/tev). In my free time, I enjoy working on a variety of pet projects and, in a past life, I was involved in developing the online rhythm game osu! (https://osu.ppy.sh/home).

Links: Email (hi@tom94.net), GitHub (https://github.com/tom94), Google Scholar, LinkedIn, Awards

Publications

- Adaptive Shells for Efficient Neural Radiance Field Rendering. Zian Wang*, Frank Shen*, Merlin Nimier-David*, Nicholas Sharp, Jun Gao, Alexander Keller, Sanja Fidler, Thomas Müller, Zan Gojcic. ACM Transactions on Graphics (SIGGRAPH Asia), Dec 2023. SIGGRAPH Asia Best Paper Award.
- Compact Neural Graphics Primitives with Learned Hash Probing. Towaki Takikawa, Thomas Müller, Merlin Nimier-David, Alex Evans, Sanja Fidler, Alec Jacobson, Alexander Keller. ACM SIGGRAPH Asia Conference Proceedings, Dec 2023.
- Recursive Control Variates for Inverse Rendering. Baptiste Nicolet, Fabrice Rousselle, Jan Novák, Alexander Keller, Wenzel Jakob, Thomas Müller. ACM Transactions on Graphics (SIGGRAPH), Jul 2023.
- Neuralangelo: High-Fidelity Neural Surface Reconstruction. Max Zhaoshuo Li, Thomas Müller, Alex Evans, Russell H. Taylor, Mathias Unberath, Ming-Yu Liu, Chen-Hsuan Lin. CVPR, Jun 2023. TIME Best Inventions of 2023.
- BundleSDF: Neural 6-DoF Tracking and 3D Reconstruction of Unknown Objects. Bowen Wen, Jonathan Tremblay, Valts Blukis, Stephen Tyree, Thomas Müller, Alex Evans, Dieter Fox, Jan Kautz, Stan Birchfield. CVPR, Jun 2023.
- Parallel Inversion of Neural Radiance Fields for Robust Pose Estimation. Yunzhi Lin, Thomas Müller, Jonathan Tremblay, Bowen Wen, Stephen Tyree, Alex Evans, Patricio A. Vela, Stan Birchfield. ICRA, May 2023.
- RTMV: A Ray-Traced Multi-View Synthetic Dataset for Novel View Synthesis. Jonathan Tremblay*, Moustafa Meshry*, Alex Evans, Jan Kautz, Alexander Keller, Sameh Khamis, Charles Loop, Nathan Morrical, Thomas Müller, Koki Nagano, Towaki Takikawa, Stan Birchfield. Learning to Generate 3D Shapes and Scenes (ECCV Workshop), Oct 2022.
- Instant Neural Graphics Primitives with a Multiresolution Hash Encoding. Thomas Müller, Alex Evans, Christoph Schied, Alexander Keller. ACM Transactions on Graphics (SIGGRAPH), Jul 2022. SIGGRAPH Best Paper Award; TIME Best Inventions of 2022.
- Unbiased Inverse Volume Rendering with Differential Trackers. Merlin Nimier-David, Thomas Müller, Alexander Keller, Wenzel Jakob. ACM Transactions on Graphics (SIGGRAPH), Jul 2022.
- Variable Bitrate Neural Fields. Towaki Takikawa, Alex Evans, Jonathan Tremblay, Thomas Müller, Morgan McGuire, Alec Jacobson, Sanja Fidler. ACM SIGGRAPH Conference Proceedings, Jul 2022.
- Extracting Triangular 3D Models, Materials, and Lighting From Images. Jacob Munkberg, Jon Hasselgren, Tianchang Shen, Wenzheng Chen, Alex Evans, Thomas Müller, Sanja Fidler. CVPR, Jun 2022.
- Dynamic Diffuse Global Illumination Resampling. Zander Majercik, Thomas Müller, Alexander Keller, Derek Nowrouzezahrai, Morgan McGuire. Computer Graphics Forum, Dec 2021.
- Path Guiding Using Spatio-Directional Mixture Models. Ana Dodik, Marios Papas, Cengiz Öztireli, Thomas Müller. Computer Graphics Forum, Dec 2021.
- Real-time Neural Radiance Caching for Path Tracing. Thomas Müller, Fabrice Rousselle, Jan Novák, Alexander Keller. ACM Transactions on Graphics (SIGGRAPH), Aug 2021.
- Neural Control Variates. Thomas Müller, Fabrice Rousselle, Alexander Keller, Jan Novák. ACM Transactions on Graphics (SIGGRAPH Asia), Nov 2020. (Capture truncated here.)
href/data/publications/mueller20neural/mueller20neural.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller20neural/mueller20neural.mp4>i classfar fa-video>/i> Video/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller20neural/mueller20neural-sigasia.mp4>i classfar fa-video>/i> SIGGRAPH Presentation/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller20neural/variance-reduction-using-neural-networks.mp4>i classfar fa-video>/i> MCQMC Presentation/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller20neural/interactive-viewer/>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller20neural/mueller20neural.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(hart20practical-video).play() onmouseoutdocument.getElementById(hart20practical-video).pause()> div classportfolio-image>a href/data/publications/hart20practical/hart20practical.pdf>video muted loop preloadauto posterimages/publications/hart20practical/768x432.jpg idhart20practical-video>source srcimages/publications/hart20practical/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/hart20practical/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/data/publications/hart20practical/hart20practical.pdf>Practical Product Sampling by Fitting and Composing Warps/a>/h2> p classauthorlist>a stylewhite-space: nowrap; hrefhttps://www.linkedin.com/in/thedavidhart/>David Hart/a>, a stylewhite-space: nowrap; hrefhttps://pharr.org/matt/>Matt Pharr/a>, span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttps://research.nvidia.com/person/ward-lopes>Ward Lopes/a>, a stylewhite-space: nowrap; hrefhttps://www.cs.williams.edu/~morgan/>Morgan McGuire/a>, a stylewhite-space: nowrap; 
hrefhttp://psgraphics.blogspot.com/p/blog-page.html>Peter Shirley/a>/p> p classvenue>a hrefhttps://onlinelibrary.wiley.com/doi/abs/10.1111/cgf.14060>Computer Graphics Forum (strong>EGSR/strong>)/a>, Jul 2020/p> p classresourcelist>span stylewhite-space: nowrap;>a href/data/publications/hart20practical/hart20practical.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a hrefhttps://www.shadertoy.com/view/wljyDz>i classfar fa-code>/i> Code (Shadertoy)/a>/span> span stylewhite-space: nowrap;>a hrefhttps://github.com/mmp/pbrt-v3/tree/warp-product-sampling>i classfar fa-code>/i> Code (PBRT)/a>/span> span stylewhite-space: nowrap;>a href/data/publications/hart20practical/hart20practical.mp4>i classfar fa-video>/i> Presentation/a>/span> span stylewhite-space: nowrap;>a href/data/publications/hart20practical/hart20practical.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(vorba19guiding-video).play() onmouseoutdocument.getElementById(vorba19guiding-video).pause()> div classportfolio-image>a href/data/courses/vorba19guiding/vorba19guiding-chapter10.pdf>video muted loop preloadauto posterimages/courses/vorba19guiding/768x432.jpg idvorba19guiding-video>source srcimages/courses/vorba19guiding/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/courses/vorba19guiding/512x288.webm typevideo/webm; codecsvp9>source srcimages/courses/vorba19guiding/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/courses/vorba19guiding/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/data/courses/vorba19guiding/vorba19guiding-chapter10.pdf>Practical Path Guiding in Production/a>/h2> p classauthorlist>span stylewhite-space: nowrap;>Thomas Müller/span>/p> p classvenue>Chapter 10 of a hrefhttps://dl.acm.org/doi/10.1145/3305366.3328091>Path Guiding in Production in ACM strong>SIGGRAPH/strong> Courses/a>, 2019/p> p classresourcelist>span 
stylewhite-space: nowrap;>a href/data/courses/vorba19guiding/vorba19guiding-chapter10.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/courses/vorba19guiding/vorba19guiding.pdf>i classfar fa-file-alt>/i> All Course Notes/a>/span> span stylewhite-space: nowrap;>a hrefhttps://github.com/Tom94/practical-path-guiding>i classfar fa-code>/i> Code/a>/span> span stylewhite-space: nowrap;>a href/data/courses/vorba19guiding/vorba19guiding-slides.key>i classfar fa-keynote>/i> Slides/a>/span> span stylewhite-space: nowrap;>a href/data/courses/vorba19guiding/vorba19guiding-slides.pdf>i classfar fa-file-pdf>/i> Slides/a>/span> span stylewhite-space: nowrap;>a href/data/courses/vorba19guiding/vorba19guiding.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(mueller18neural-video).play() onmouseoutdocument.getElementById(mueller18neural-video).pause()> div classportfolio-image>a href/data/publications/mueller18neural/mueller18neural-v4.pdf>video muted loop preloadauto posterimages/publications/mueller18neural/768x432.jpg idmueller18neural-video>source srcimages/publications/mueller18neural/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/publications/mueller18neural/512x288.webm typevideo/webm; codecsvp9>source srcimages/publications/mueller18neural/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/mueller18neural/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/data/publications/mueller18neural/mueller18neural-v4.pdf>Neural Importance Sampling/a>/h2> p classauthorlist>span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttps://www.inf.ethz.ch/personal/mcbrian/>Brian McWilliams/a>, a stylewhite-space: nowrap; hrefhttps://research.nvidia.com/person/fabrice-rousselle>Fabrice Rousselle/a>, a stylewhite-space: nowrap; 
hrefhttps://graphics.ethz.ch/people/grossm/>Markus Gross/a>, a stylewhite-space: nowrap; hrefhttp://jannovak.info>Jan Novák/a>/p> p classvenue>a hrefhttps://dl.acm.org/doi/10.1145/3341156>ACM Transactions on Graphics (Presented at strong>SIGGRAPH/strong>)/a>, Oct 2019/p> p classresourcelist>span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural-v4.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural.mp4>i classfar fa-video>/i> Video/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural-presentation.mp4>i classfar fa-video>/i> Presentation/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural-slides.key>i classfar fa-keynote>/i> Slides/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural-slides.pdf>i classfar fa-file-pdf>/i> Slides/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/interactive-viewer/>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller18neural/mueller18neural.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(bitterli18framework-video).play() onmouseoutdocument.getElementById(bitterli18framework-video).pause()> div classportfolio-image>a href/data/publications/bitterli18framework/bitterli18framework.pdf>video muted loop preloadauto posterimages/publications/bitterli18framework/768x432.jpg idbitterli18framework-video>source srcimages/publications/bitterli18framework/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/publications/bitterli18framework/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/bitterli18framework/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a 
href/data/publications/bitterli18framework/bitterli18framework.pdf>A radiative transfer framework for non-exponential media/a>/h2> p classauthorlist>a stylewhite-space: nowrap; hrefhttp://benedikt-bitterli.me>Benedikt Bitterli/a>, a stylewhite-space: nowrap; hrefhttps://www.cs.dartmouth.edu/~sriravic/>Srinath Ravichandran/a>, span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttp://magnuswrenninge.com>Magnus Wrenninge/a>, a stylewhite-space: nowrap; hrefhttp://jannovak.info>Jan Novák/a>, a stylewhite-space: nowrap; hrefhttp://www.cs.cornell.edu/~srm/>Steve Marschner/a>, a stylewhite-space: nowrap; hrefhttps://www.cs.dartmouth.edu/~wjarosz/>Wojciech Jarosz/a>/p> p classvenue>a hrefhttps://dl.acm.org/doi/10.1145/3272127.3275103>ACM Transactions on Graphics (strong>SIGGRAPH Asia/strong>)/a>, Nov 2018/p> p classresourcelist>span stylewhite-space: nowrap;>a href/data/publications/bitterli18framework/bitterli18framework.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/bitterli18framework/bitterli18framework-supp.pdf>i classfar fa-file-alt>/i> Supplementary/a>/span> span stylewhite-space: nowrap;>a href/data/publications/bitterli18framework/bitterli18framework.mp4>i classfar fa-video>/i> Video/a>/span> span stylewhite-space: nowrap;>a hrefhttps://cs.dartmouth.edu/wjarosz/publications/bitterli18framework-supplemental/index.html>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a hrefhttps://cs.dartmouth.edu/wjarosz/publications/bitterli18framework-supplemental.zip>i classfar fa-database>/i> Supplementary Data/a>/span> span stylewhite-space: nowrap;>a href/data/publications/bitterli18framework/bitterli18framework.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(kallweit17deep-video).play() 
onmouseoutdocument.getElementById(kallweit17deep-video).pause()> div classportfolio-image>a href/data/publications/kallweit17deep/kallweit17deep.pdf>video muted loop preloadauto posterimages/publications/kallweit17deep/768x432.jpg idkallweit17deep-video>source srcimages/publications/kallweit17deep/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/kallweit17deep/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/data/publications/kallweit17deep/kallweit17deep.pdf>Deep Scattering: Rendering Atmospheric Clouds with Radiance-Predicting Neural Networks/a>/h2> p classauthorlist>a stylewhite-space: nowrap; hrefhttp://simon-kallweit.me>Simon Kallweit/a>, span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttps://www.inf.ethz.ch/personal/mcbrian/>Brian McWilliams/a>, a stylewhite-space: nowrap; hrefhttps://graphics.ethz.ch/people/grossm/>Markus Gross/a>, a stylewhite-space: nowrap; hrefhttp://jannovak.info>Jan Novák/a>/p> p classvenue>a hrefhttps://dl.acm.org/doi/10.1145/3130800.3130880>ACM Transactions on Graphics (strong>SIGGRAPH Asia/strong>)/a>, Nov 2017/p> p classresourcelist>span stylewhite-space: nowrap;>a href/data/publications/kallweit17deep/kallweit17deep.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/kallweit17deep/kallweit17deep-supp.pdf>i classfar fa-file-alt>/i> Supplementary/a>/span> span stylewhite-space: nowrap;>a href/data/publications/kallweit17deep/kallweit17deep.mp4>i classfar fa-video>/i> Video/a>/span> span stylewhite-space: nowrap;>a href/data/publications/kallweit17deep/interactive-viewer/>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a hrefhttps://studios.disneyresearch.com/2017/11/20/deep-scattering-rendering-atmospheric-clouds-with-radiance-predicting-neural-networks/>i classfar fa-database>/i> Cloud Database/a>/span> span stylewhite-space: nowrap;>a 
href/data/publications/kallweit17deep/kallweit17deep.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(mueller17practical-video).play() onmouseoutdocument.getElementById(mueller17practical-video).pause()> div classportfolio-image>a href/pages/publications/mueller17practical-erratum>video muted loop preloadauto posterimages/publications/mueller17practical/768x432.jpg idmueller17practical-video>source srcimages/publications/mueller17practical/512x288.webm typevideo/webm; codecsvp9>source srcimages/publications/mueller17practical/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/publications/mueller17practical/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/mueller17practical/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/pages/publications/mueller17practical-erratum>Practical Path Guiding for Efficient Light-Transport Simulation/a>/h2> p classauthorlist>span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttps://graphics.ethz.ch/people/grossm/>Markus Gross/a>, a stylewhite-space: nowrap; hrefhttp://jannovak.info>Jan Novák/a>/p> p classvenue>a hrefhttps://onlinelibrary.wiley.com/doi/abs/10.1111/cgf.13227>Computer Graphics Forum (strong>EGSR/strong>)/a>, Jun 2017/p> p classvenue>i classfar fa-award>/i> strong>EGSR/strong> Best Paper Award/p> p classresourcelist>span stylewhite-space: nowrap;>a href/pages/publications/mueller17practical-erratum>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller17practical/mueller17practical-supp.pdf>i classfar fa-file-alt>/i> Supplementary/a>/span> span stylewhite-space: nowrap;>a hrefhttps://github.com/Tom94/practical-path-guiding>i classfar fa-code>/i> Code/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller17practical/mueller17practical.mp4>i classfar fa-video>/i> 
Video/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller17practical/mueller17practical.pptx>i classfar fa-file-powerpoint>/i> Slides/a>/span> span stylewhite-space: nowrap;>a href/pages/publications/mueller17practical-erratum>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller17practical/mueller17practical.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(mueller16efficient-video).play() onmouseoutdocument.getElementById(mueller16efficient-video).pause()> div classportfolio-image>a href/data/publications/mueller16efficient/mueller16efficient.pdf>video muted loop preloadauto posterimages/publications/mueller16efficient/768x432.jpg idmueller16efficient-video>source srcimages/publications/mueller16efficient/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/publications/mueller16efficient/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/publications/mueller16efficient/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/data/publications/mueller16efficient/mueller16efficient.pdf>Efficient Rendering of Heterogeneous Polydisperse Granular Media/a>/h2> p classauthorlist>span stylewhite-space: nowrap;>Thomas Müller/span>, a stylewhite-space: nowrap; hrefhttps://studios.disneyresearch.com/people/marios-papas/>Marios Papas/a>, a stylewhite-space: nowrap; hrefhttps://graphics.ethz.ch/people/grossm/>Markus Gross/a>, a stylewhite-space: nowrap; hrefhttps://www.cs.dartmouth.edu/~wjarosz/>Wojciech Jarosz/a>, a stylewhite-space: nowrap; hrefhttp://jannovak.info>Jan Novák/a>/p> p classvenue>a hrefhttps://dl.acm.org/doi/10.1145/2980179.2982429>ACM Transactions on Graphics (strong>SIGGRAPH Asia/strong>)/a>, Nov 2016/p> p classvenue>i classfar fa-award>/i> a hrefhttp://vcg.isti.cnr.it/cgf/winner.php>Won the strong>CGF Cover Contest 2017/strong>/a>/p> p 
classresourcelist>span stylewhite-space: nowrap;>a href/data/publications/mueller16efficient/mueller16efficient.pdf>i classfar fa-file-alt>/i> Paper/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller16efficient/mueller16efficient-supp.pdf>i classfar fa-file-alt>/i> Supplementary/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller16efficient/mueller16efficient.mp4>i classfar fa-video>/i> Video/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller16efficient/interactive-viewer/>i classfar fa-rocket-launch>/i> Interactive Results/a>/span> span stylewhite-space: nowrap;>a hrefhttps://studios.disneyresearch.com/2016/11/11/efficient-rendering-of-heterogeneous-poly-disperse-granular-media/>i classfar fa-database>/i> Shell Database/a>/span> span stylewhite-space: nowrap;>a href/data/publications/mueller16efficient/mueller16efficient.bib>i classfar fa-quote-right>/i> BibTeX/a>/span>/p>/div>div styleclear: both>/div>/div>h1 idsec-projects>i classfar fa-hat-wizard>/i> Selected Personal Projects/h1>div idportfolio classportfolio-item onmouseoverdocument.getElementById(femto-video).play() onmouseoutdocument.getElementById(femto-video).pause()> div classportfolio-image>a href/pages/projects/femto>video muted loop preloadauto posterimages/projects/femto/768x432.jpg idfemto-video>source srcimages/projects/femto/512x288-hevc.mp4 typevideo/mp4; codecshevc>source srcimages/projects/femto/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/projects/femto/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a href/pages/projects/femto>Real-Time Transient Rendering/a>/h2> p classvenue>p styletext-align: justify;>WebGL brute-force path tracer for visualizing the propagation of light through space. This toy project was inspired by a hrefhttp://femtocamera.info>femto photography/a> and offline transient rendering a hrefhttps://dl.acm.org/citation.cfm?id2661251>Jarabo et al. 
2014/a>, a hrefhttps://benedikt-bitterli.me/femto.html>Bitterli 2016/a>./p>/p> p classresourcelist>span stylewhite-space: nowrap;>a href/pages/projects/femto>i classfar fa-rocket-launch>/i> Interactive Demo/a>/span> span stylewhite-space: nowrap;>a hrefhttps://www.shadertoy.com/view/Mt33Wn>i classfar fa-code>/i> Code (Shadertoy)/a>/span> span stylewhite-space: nowrap;>a href/data/projects/femto/1920x1080.mp4>i classfar fa-video>/i> Video/a>/span>/p>/div>div styleclear: both>/div>/div>div idportfolio classportfolio-item onmouseoverdocument.getElementById(tev-video).play() onmouseoutdocument.getElementById(tev-video).pause()> div classportfolio-image>a hrefhttps://github.com/Tom94/tev/releases>video muted loop preloadauto posterimages/projects/tev/768x432.jpg idtev-video>source srcimages/projects/tev/512x288.mp4 typevideo/mp4; codecsavc1.4D401E>img srcimages/projects/tev/768x432.jpg>/video>/a>/div>div classportfolio-text> h2>a hrefhttps://github.com/Tom94/tev/releases>strong>tev/strong> — The EXR Viewer/a>/h2> p classvenue>p styletext-align: justify;>High dynamic range (HDR) image comparison tool for graphics people. strong>tev/strong> allows viewing images through various tonemapping operators and inspecting the values of individual pixels. 
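The kind of error metric such a comparison tool can visualize is easy to sketch. The snippet below computes a per-pixel relative squared error between a rendering and a reference — a common choice in rendering research because it weights bright and dark regions comparably. The function name and the 0.01 stabilizer are illustrative assumptions, not tev's actual API or implementation.

```python
def relative_squared_error(img, ref, eps=0.01):
    # Per-pixel relative squared error between two grayscale images,
    # given as nested lists of floats. The squared difference is divided
    # by the stabilized squared reference value so that errors in dark
    # regions do not dominate the metric. (Illustrative sketch only.)
    return [
        [(i - r) ** 2 / (r * r + eps) for i, r in zip(img_row, ref_row)]
        for img_row, ref_row in zip(img, ref)
    ]

# Toy 2x2 "rendering" compared against its reference.
reference = [[1.0, 2.0], [0.5, 4.0]]
rendering = [[1.1, 2.0], [0.4, 3.0]]
error = relative_squared_error(rendering, reference)
mean_error = sum(sum(row) for row in error) / 4
```

A viewer would map each per-pixel value to a false-color scale; averaging over all pixels gives a single score for ranking renderings against the reference.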
osu!tp — Player Ranking System for the Online Rhythm Game osu!
Player ranking system based on automated analysis of gameplay patterns, which eventually became osu!'s official ranking system.
Live Version (https://osutp.tom94.net) · Code (https://github.com/ppy/osu-performance)

454,728 views | query took 0.0009 seconds
© 2016-2022 Thomas Müller