Ray tracing was invented by Turner Whitted around 1980. His classic paper is really amazing; it had bounding volume hierarchies, adaptive sampling, and its future-work section suggested randomized reflection. Since then ray tracing has branched into a ton of variations and is used in most movies for the indirect "bounce" lighting.
I was assigned a 2D ray tracer in a physics class in 1984 and I was hooked. Since then I have written a bunch of ray tracers in Pascal, Fortran, Scheme, C, C++, Java, and most recently Swift (which is awesome for writing ray tracers). I've also taught classes with ray tracing about a dozen times, most recently at Westminster College in 2015. My approach over the years has evolved toward what I think are the most fun parts of ray tracing that still point in the direction of writing a production-quality ray tracer.
Ray Tracing in One Weekend
The resulting program generates this image:
Here's my MacOS code for the book. It should be semi-portable, but drand48() will need to be written on some systems and the file I/O may too. I also have a little post on the tools I used to write the book, for those interested in doing a little book of their own. Please let me know if you do!
Links for further reading and exploration related to the book:
Chapter 0: Overview
Here is a list of basic graphics texts that cover the vector background the book assumes:
Fundamentals of Computer Graphics, Fourth Edition
The Graphics Codex
Real-Time Rendering, Third Edition
Chapter 1: Output an image
For output of most LDR and HDR images I love stb_image.
For more info on HDR images and tone mapping, I like the book High Dynamic Range Imaging, Second Edition: Acquisition, Display, and Image-Based Lighting.
Chapter 2: The vec3 class
I advocate using the same class for points, displacements, colors, etc. Some people like more structure and type checking (so, for example, multiplying two locations would be rejected by the compiler). An example article where points and vectors are different is here. Jim Arvo and Brian Smits experimented with not only distinguishing points and vectors, but using the compiler to do dimensional analysis (so velocity, length, and time would be different types, for example). They found this to be too cumbersome in 1990s C++, but I'd love to hear about anybody's experience. Researchers at Dartmouth have made a really serious effort at this, and their code and paper are available on GitHub.
Chapter 3: Rays, a simple camera, and background
The first thing you might add to your background function is an environment map. Paul Debevec has a terrific history of environment mapping. The easiest mapping for this uses a single image for the entire sphere of directions. Paul also provides some good images to use for your environment map.
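If you want to try it, here is a sketch of the single-image (equirectangular, latitude-longitude) mapping from a unit direction to image coordinates. This is not code from the book; the names and the axis/orientation convention are illustrative choices:

```cpp
#include <cassert>
#include <cmath>

// Map a unit direction (dx, dy, dz) to texture coordinates (u, v) in [0,1]^2
// using the equirectangular (latitude-longitude) layout. Assumes the
// direction is normalized; the axis convention here is an arbitrary choice.
void direction_to_uv(double dx, double dy, double dz, double& u, double& v) {
    const double PI = 3.1415926535897932385;
    double phi   = atan2(dz, dx);  // longitude in (-pi, pi]
    double theta = asin(dy);       // latitude in [-pi/2, pi/2]
    u = (phi + PI) / (2.0 * PI);
    v = (theta + PI / 2.0) / PI;
}
```

Look up the environment image at (u, v) whenever a ray misses everything, in place of the blue-white gradient.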
Chapter 4: Adding a sphere
There are a bunch of other object types you can add. Triangles are usually first, and I am a fan of barycentric methods. After triangles, many people quit adding primitives because graphics has such a big infrastructure for triangles. Ellipsoids are an easy thing to add, but instancing is usually a more "ray tracey" approach (let the software do the heavy lifting). Composite objects via CSG are surprisingly straightforward.
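For the triangle case, here is a sketch of the standard barycentric test in the Moller-Trumbore style. The minimal vector struct is a stand-in for the book's vec3, and the epsilon is an arbitrary choice:

```cpp
#include <cassert>
#include <cmath>

struct V3 { double x, y, z; };
static V3     vsub(V3 a, V3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static double vdot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static V3     vcross(V3 a, V3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}

// Moller-Trumbore: solve o + t*dir = (1-u-v)*v0 + u*v1 + v*v2 for the
// barycentrics (u, v) and ray parameter t; it is a hit when u >= 0,
// v >= 0, u + v <= 1, and t > 0.
bool hit_triangle(V3 o, V3 dir, V3 v0, V3 v1, V3 v2, double& t) {
    V3 e1 = vsub(v1, v0), e2 = vsub(v2, v0);
    V3 p = vcross(dir, e2);
    double det = vdot(e1, p);
    if (fabs(det) < 1e-12) return false;   // ray (nearly) parallel to plane
    double inv = 1.0 / det;
    V3 s = vsub(o, v0);
    double u = vdot(s, p) * inv;
    if (u < 0.0 || u > 1.0) return false;
    V3 q = vcross(s, e1);
    double v = vdot(dir, q) * inv;
    if (v < 0.0 || u + v > 1.0) return false;
    t = vdot(e2, q) * inv;
    return t > 0.0;
}
```

Wrapping this in a hitable subclass (and interpolating vertex normals with the same u and v) is the natural next step.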
Chapter 5: Surface normals and multiple objects
If you want your code to be more efficient for large numbers of objects, use a BVH -- they are as good as any other acceleration structure in efficiency, and they are the most robust and easiest to implement.
Chapter 6: Antialiasing
Box filtering as done in the book suffices for most situations. However, Gaussian-like filters can have advantages and are pretty easy. You can either uniformly sample the whole screen and weight the samples, or use non-uniform samples. All approaches work.
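Here is a hedged sketch of the weight-the-samples approach. Everything in it -- the sigma, the rand()-based jitter, the scalar shade callback -- is an illustrative stand-in, not book code:

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

// Gaussian weight for a sample offset (dx, dy) from the pixel center.
double gauss_weight(double dx, double dy, double sigma) {
    return exp(-(dx*dx + dy*dy) / (2.0 * sigma * sigma));
}

// Weighted average of a scalar shade function sampled at jittered offsets
// within the pixel. Normalizing by the total weight keeps constant regions
// unchanged, so the filter only affects edges and fine detail.
double filtered_pixel(double (*shade)(double, double), int ns, double sigma) {
    double sum = 0.0, wsum = 0.0;
    for (int s = 0; s < ns; ++s) {
        double dx = double(rand()) / RAND_MAX - 0.5; // jitter in [-0.5, 0.5]
        double dy = double(rand()) / RAND_MAX - 0.5;
        double w = gauss_weight(dx, dy, sigma);
        sum  += w * shade(dx, dy);
        wsum += w;
    }
    return sum / wsum;
}
```

The non-uniform alternative is to importance-sample the filter itself (pick offsets with Gaussian density) and then just average the samples unweighted.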
Chapter 7: Diffuse Materials
"Ideal" diffuse materials, also called "Lambertian" are used 99% of the time in graphics. The wikipedia article on this approximation is good. Real diffuse surfaces do not behave exactly as Lambertian (for example they get specular at grazing angle) but especially with interreflection in the mix the appearance differences are minor. So this is probably not where you should push your renderer until many many other features are addressed.
Chapter 8: Metal
The first improvement you might make is to have the color of the metal go to white at grazing angles. The Schlick approximation (used in Chapter 9 for glass, where grazing behavior matters more) works for that. Full-bore Fresnel equations will describe color variation with angle, but in my experience getting the normal-incidence color right is good enough.
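For reference, the Schlick approximation is just a few lines; this is essentially the form used for glass in Chapter 9 (r0 is the reflectance at normal incidence):

```cpp
#include <cassert>
#include <cmath>

// Schlick's approximation to Fresnel reflectance. cosine is the cosine of
// the incident angle at the interface; ref_idx is the refractive index of
// the material against air (e.g. about 1.5 for glass).
double schlick(double cosine, double ref_idx) {
    double r0 = (1.0 - ref_idx) / (1.0 + ref_idx);
    r0 = r0 * r0;
    return r0 + (1.0 - r0) * pow(1.0 - cosine, 5.0);
}
```

To whiten a metal at grazing angles you could blend the albedo toward (1,1,1) with that same pow(1 - cosine, 5) term; that blend is a suggestion here, not something the book spells out.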
Chapter 9: Dielectrics
The first thing you might add is filtering of light within the dielectric (like the subtle "greenness" of much glass). This is a classic exponential decay and is covered well in the Beer's Law section of this nice page.
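A minimal sketch of that attenuation follows; the per-channel absorption coefficients below are made-up numbers for slightly green glass, not measured data:

```cpp
#include <cassert>
#include <cmath>

// Beer's law: light traveling a distance d through an absorbing medium is
// scaled by exp(-absorb * d) in each color channel, a classic exponential
// decay. Larger absorb values mean the channel dies off faster.
void beer_attenuation(double d, const double absorb[3], double out[3]) {
    for (int i = 0; i < 3; ++i)
        out[i] = exp(-absorb[i] * d);
}
```

Apply it to the ray's attenuation using the distance traveled inside the dielectric between the entry and exit hit points; absorbing red and blue more than green is what gives glass its subtle green cast.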
Chapter 10: Positionable camera
Camera parameter setting is just plain ugly. The system used in the book is relatively common and is in my opinion the prettiest. I avoid the matrix formulation wherever possible because I never understand my code when I am done. But it is very elegant and works.
Chapter 11: Defocus blur
If you ever need to match a real camera, there is a nice survey of models used in graphics. And here is a very good presentation on real lenses written for a technical audience.
Chapter 12: Where next?
You should have the core of a very serious ray tracer now. I would now take it in one of three directions. They are not mutually exclusive but explicitly deciding your goals will simplify architectural decisions.
- Make it physically accurate. This will imply using spectra instead of RGB (I like just using a big array of wavelengths) and getting materials whose reflectances you know. A popular choice is the X-Rite MSCCC ColorChecker Classic, whose data is available online.
- Make it good for generating animations. Lots of movies use a ray-traced system, and the Disney, Pixar, and Solid Angle teams have all disclosed a remarkable amount about their code. Work on features first and then efficiency. I think you will be amazed how soon you can produce amazing images.
- Make it fast. Here you can roll your own or start using a commercial API. To see exactly where that community is now, go to the 2016 HPG conference, or work backward through their previous papers. They are a very open community, and the papers go into much detail relative to many sub-fields of computer graphics.
You should consider publishing in iBooks too because it's easier to push updates. I imagine the book could be perfect but most computer books seem to have a few mistakes. It's just the nature of the topic.
Thanks-- I already pushed an update, so no, not perfect! I'll look into iBooks. I think currently I have enabled some Amazon features related to loaning and Kindle Unlimited free reading. I'll do some due diligence on the ecosystem once I finish part 2. I've had requests for PDF as well, and there are a bunch of options there I don't yet understand...
SafeGI is a good example of the usage of dimensional analysis in a rendering system.
https://github.com/ouj/safegi
Really interesting work. Thanks for the link. The github has the EGSR paper as well. I'll be very interested in the authors' thoughts on this in a couple of years as it is clearly a double-edged sword. I added your link above. Thanks!
In Chapter 8, where you add the metal material, can you provide the changes to the sphere class? The modifications to main() call sphere() with an additional 'new lambertian()' parameter, but I can't figure out what needs to be done in the sphere class to make it work.
Thanks!
I'll take a close look at this part of the discussion. In the meantime, here is the final sphere class.
#ifndef SPHEREH
#define SPHEREH

#include "hitable.h"

class sphere: public hitable {
public:
    sphere() {}
    sphere(vec3 cen, float r, material *m) : center(cen), radius(r), mat_ptr(m) {};
    virtual bool hit(const ray& r, float tmin, float tmax, hit_record& rec) const;
    vec3 center;
    float radius;
    material *mat_ptr;
};

bool sphere::hit(const ray& r, float t_min, float t_max, hit_record& rec) const {
    vec3 oc = r.origin() - center;
    float a = dot(r.direction(), r.direction());
    float b = dot(oc, r.direction());
    float c = dot(oc, oc) - radius*radius;
    float discriminant = b*b - a*c;
    if (discriminant > 0) {
        float temp = (-b - sqrt(b*b-a*c))/a;
        if (temp < t_max && temp > t_min) {
            rec.t = temp;
            rec.p = r.point_at_parameter(rec.t);
            rec.normal = (rec.p - center) / radius;
            rec.mat_ptr = mat_ptr;
            return true;
        }
        temp = (-b + sqrt(b*b-a*c))/a;
        if (temp < t_max && temp > t_min) {
            rec.t = temp;
            rec.p = r.point_at_parameter(rec.t);
            rec.normal = (rec.p - center) / radius;
            rec.mat_ptr = mat_ptr;
            return true;
        }
    }
    return false;
}
#endif
This discriminant does not work for me. When you removed the 2.0 factors (the 4 in the numerator and the 2 in the denominator), the results are not valid any more. Am I missing something? From a mathematical point of view I don't think that is correct, as the 4 is inside the square root, affecting only one of the terms. Also, you already calculated the discriminant; you don't need to write it again inside the sqrt.
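A note for readers tripped up at the same spot: in the code above, b is defined as dot(oc, r.direction()), which is half of the usual quadratic coefficient B = 2*dot(oc, direction). Substituting B = 2b into the classic formula cancels the 4 under the root against the 2 in the denominator, so both forms give identical roots. A quick numeric check:

```cpp
#include <cassert>
#include <cmath>

// Smaller root of a*t^2 + B*t + c = 0, classic quadratic formula.
double root_classic(double a, double B, double c) {
    return (-B - sqrt(B*B - 4.0*a*c)) / (2.0*a);
}

// Same root with the halved coefficient b = B/2, as the sphere code uses:
// (-2b - sqrt(4b^2 - 4ac)) / (2a) = (-b - sqrt(b*b - a*c)) / a.
double root_halved(double a, double b, double c) {
    return (-b - sqrt(b*b - a*c)) / a;
}
```

So the book's form is algebraically identical to the textbook one, just with less arithmetic per ray.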
There also appears to be a circular dependency between the mat_ptr field in the hit_record struct, and the material class. It's not clear how you resolved this.
Thanks
I think the hit_record only needs to know that mat_ptr is a pointer but doesn't need to know the size of what it points to, so "class material;" at the beginning of the file where hit_record is defined takes care of it. (I say "I think" because I never trust my own compiler's acceptance of something that it works under the actual language spec.)
I looked it over and this is not mentioned in the book at all! I have fixed this and sent a new copy to amazon. It will probably update sometime today. Thanks for telling me about this!
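For anyone who hits the same circular-dependency question later, here is a minimal self-contained sketch of the forward-declaration fix (types simplified; this is not the book's full hit_record):

```cpp
#include <cassert>

class material;        // forward declaration: name only, no definition yet

struct hit_record {
    float t;
    material* mat_ptr; // a pointer's size is known without the definition
};

// The full definition can come later (or in another header entirely).
class material {
public:
    virtual ~material() {}
    virtual int id() const { return 42; }
};
```

The compiler only needs the complete material definition where you actually call through mat_ptr, not where hit_record is declared.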
That worked, thanks! I can reproduce the images in chapter 8 now. On to chapter 9!
Awesome. Thanks for your help-- a book has to be debugged too and the readers are the compiler :)
Some more comments. Not sure how to specify location, as my Kindle reader just uses 'Location' -- hopefully that is universal:
Location 283:
list[2] sphere color doesn't match color in the picture on next page (blue vs. red)
Location 314:
The formula for the 'discriminant' variable gives bad results for me. But when I change it to:
float discriminant = 2.0 - ni_over_nt*ni_over_nt*(1-dt*dt);
my results are closer to yours.
Location 316:
attenuation = vec3(1.0, 1.0, 0.0);
should be:
attenuation = vec3(1.0, 1.0, 1.0);
Thanks! It looks like they are either close to universal or close enough.
On 283: I think list[2] is the big sphere, and (0.8, 0.6, 0.2) is some yellowish color. The blue sky will shift it toward blue, so the greenish pale "ground" sphere should be ok?
I think that the formula in the book is ok. Here is what the Ray Tracing News says (RTN was a very active list among ray tracing nerds in the dark ages): http://www.realtimerendering.com/resources/RTNews/html/rtnv10n1.html#art3
316: good catch! Thanks.
Hey Peter,
I'm not sure if it's me or the code, but I couldn't get the refraction in the dielectric working right until I switched to passing unit vectors to the refract method. After that it matches the renders in the book perfectly.
Which vector? I do this in the first line of refract: vec3 uv = unit_vector(v);
Yes, but the normal also seems to need to be 'normalised':
bool refract(const vec3& v, const vec3& n, float ni_over_nt, vec3& refracted) {
    vec3 uv = v.unit_vector();
    vec3 un = n.unit_vector();
    float dt = dot(uv, un);
    float discriminant = 2.0 - ni_over_nt*ni_over_nt*(1.0 - dt*dt);
    if (discriminant > 0) {
        refracted = ni_over_nt*(uv - un*dt) - un*sqrt(discriminant);
        return true;
    } else {
        return false;
    }
}
I also had problems, but the fix was that the line which reads:
refracted = ni_over_nt*(v - n*dt) - n*sqrt(discriminant);
should read:
refracted = ni_over_nt*(uv - n*dt) - n*sqrt(discriminant);
Note the use of "uv" instead of the non-normalized "v". After making that one-character change I had the same results as in the book. I didn't need to normalize the normal.
I had the same problem, thanks for posting your fix!
Using unit vectors for the refract method also worked for me -- thanks for the tip.
Not having too much time, it took me more than one weekend to complete the raytracer :) but I'm quite happy with it. I adapted the C++ code to JavaScript and it now runs inside a browser. It is a bit slow, but I think it serves its educational purpose very well.
ReplyDeleteI am curious about the camera parameters you used to generate the final image from chapter 12 (lookfrom, lookat,... , aperture, focus_dist). After some experimentation my image was similar to yours, but not quite so. In particular the perspective looks a bit strange, the small balls to the left were distorted like ovals.
Cool! Could you post a link?-- ray tracing in a browser is always fun.
The camera I use could be wrong. Let me set up some unit tests.
Also, spheres projected onto an image form ovals. If you get close enough to your image that your eye has the same field of view as your virtual eye does, you will see those ovals in perspective, so on your retina you will get approximately a disk. In my first ray tracer I chased this "bug" for weeks...
Yes, I will post a link, but I need to clean up the code first and this will require another weekend.
Here's a link to my JavaScript version:
https://www.googledrive.com/host/0BwNzpRS7-ABHeDFuRXJ6NmV0TFE/raytracer-ch12.html
This code will run in your browser, so beware that it is slow! I reduced the number of small spheres and the number of samples for antialiasing, but even so, it takes a few minutes to render the image on a relatively fast notebook.
The same directory also has versions that generate the images in each chapter. For example, for chapter 8:
https://www.googledrive.com/host/0BwNzpRS7-ABHeDFuRXJ6NmV0TFE/raytracer-ch08.html
Chapter 9 has two images, "ch09-1" and "ch09-2".
I think I can make this a bit faster by exploring some features of ES6 (like typed arrays). But for now I'm just waiting for the second book in the series :)
Would it be possible to elaborate more on the issue where spheres appear as ovals? I have also been chasing this issue for a few weeks!
dargouder, there's a good explanation of this in the book "Real-Time Rendering" 3rd Ed (Möller et al.) p. 449. They explain that it's not an error and relates to a discrepancy between the field of view of the image and the FOV of your eye viewing the image on a monitor, so if the two FOVs match, the warping goes away. There should be some guides on the internet stating how far away from your monitor you need to be, given its width, to get a specific FOV. (I know Lengyel's "Mathematics for 3D Game Programming & Computer Graphics" has a discussion on this, and I would be surprised if Möller's book didn't also have a discussion on how viewing distance to a monitor relates to perceived FOV.)
A few comments:
In location 139 the formula:
t*t*dot(B,B) + 2*t*dot(A-C) + dot(C,C) - R*R = 0
seems to be wrong (dot(A-C) doesn't even make sense).
By looking at the code I think it should be something like:
t*t*dot(B,B) + 2*t*dot(A-C, B) + dot(A-C, A-C) - R*R = 0
All the images I generate seem to have a fish eye effect unless I use a small angle in the camera (like 20 degrees). It also happens on most pictures of the book, where the sphere in centre looks fine but the ones on the sides look distorted. Is it common to use such a small angle in the camera or to do something to compensate?
Could anyone else replicate the bubble effect with the sphere with negative radius inside the glass sphere in 327? I even tried copying the code line by line and still couldn't get it (it's as if the negative sphere is not there at all).
What does your sphere intersection and normal code look like?
And good eye on the formula. The correct formula (which you got) is, with luck, in the updated version on the Kindle store.
Noticed the same problem with the formula -- got Kindle version on Feb 23, 2016 and still not fixed.
Also, just prior to that one there is "dot((p - C),(p - C)) = (y-cy)*(y-cy) + (z-cz)*(z-cz)" -- missing (x-cx)*(x-cx).
I also needed to normalize the normal vector in the refraction function. After that I was able to reproduce all images in the book (including the sphere with negative radius). As for the circular dependency, I just added 'class Material;' in the Hitable file. Just in case, I've uploaded it to my github as https://github.com/netolcc06/BabyRayTracer.
Your code doesn't quite compile for me. What is your platform? I like that you templated vec3, and it would be fun to race floats and doubles...
I also like your drand48() substitute. Not needing to know how big rand()'s range is is clever. Mind if I steal your rand for the book?
You can steal everything you want. :)
I'm coding on Visual C++. A friend of mine also had trouble compiling my code on Linux. After I finish everything I want to add to the ray tracer, I'll port the code.
Besides, I've posted an image with many spheres to the main page of github. Check it out: https://github.com/netolcc06/BabyRayTracer
I've added planes and boxes. After triangles, textures and meshes, I intend to play with lights. Any hints on how to proceed?
Prof Shirley,
Hope you consider writing a similar book for Swift. I would definitely buy it!
In your second dielectric code example, the first calculation of cosine multiplies by the ref_idx but the second calculation of cosine does not. Is that a bug or intentional? If it's intentional why is there a difference?
You also calculate scattered = Ray(rec.p, reflected); twice.
Thanks! That first "scattered =" can go. The cosine is intentional, and I should add some words on it in the book. The Fresnel reflectance is the same no matter which direction you are traveling, in or out. The Schlick approximation to the Fresnel reflectance is framed in terms of the cosine on the refractive index = 1.0 side (the larger-angle side) of the material. HOWEVER, it may be both intentional and a bug -- Snell's law applies to sines. I will need to do some digging here. Great eye!
Yep it's a bug! Here is the fix:
http://psgraphics.blogspot.com/2016/03/my-buggy-implimentation-of-schlick.html
Once I clean it up a bit I will add it to the book.
Hi, thanks for writing this short book. I took a class in college where we did a ray tracing program, so this was a bit of a refresher.
I implemented mine in Python to brush up on my Python skills.
Can you provide the camera parameters for the last image?
Thanks
Cool-- I will be interested to see how fast your python version is. The efficiency tools keep getting better.
Here is my main with all the scene parameters:
http://www.cs.utah.edu/~shirley/main.cc
I'll get some numbers. I modified it to do parallel processing to speed it up.
It's pretty slow. For comparison, the Python version took 10 minutes and 20 seconds; the C++ version took 10.5 seconds. My picture was 200x100 with 5 samples per pixel. Single CPU for each program.
Maybe there is a bug in my Python code somewhere, but the pictures do come out the same.
Would it be possible to send me your python code? I am trying to create a small raytracing program for my school project. It's been quite difficult, mainly because most of the tutorials are for C++ or other languages (other than python). Would be great if you could send with explanations. Many thanks!
DeleteI found the discussion about diffuse material a little bit complicated. In addition, in this code
vec3 target = rec.p + rec.normal + random_in_unit_sphere();
return 0.5 * color(ray(rec.p, target - rec.p), world);
why do you add rec.p to target and then subtract it in the ray function?
You are right! I found that less confusing for myself when I wrote it, but I think your way is much better. I probably will need to change the figure but I think this will be a rare "100% better on all dimensions" code change....
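For anyone following along, the simplified version looks like this. A hedged sketch: the stand-in vector struct and the rand()-based substitute for drand48() are illustrative, not the book's exact code:

```cpp
#include <cassert>
#include <cstdlib>

struct V3d { double x, y, z; };

// Uniform double in [0,1); a portable stand-in for drand48().
static double rnd01() { return double(rand()) / (double(RAND_MAX) + 1.0); }

// Rejection-sample a point inside the unit sphere, as in the book.
V3d random_in_unit_sphere() {
    V3d p;
    do {
        p.x = 2.0 * rnd01() - 1.0;
        p.y = 2.0 * rnd01() - 1.0;
        p.z = 2.0 * rnd01() - 1.0;
    } while (p.x*p.x + p.y*p.y + p.z*p.z >= 1.0);
    return p;
}

// The simplification discussed above: build the scatter *direction*
// n + s directly, instead of a target point that gets rec.p subtracted
// right back out. For a unit normal n this always points off the surface,
// since |s| < 1 implies dot(n + s, n) = 1 + dot(s, n) > 0.
V3d diffuse_direction(V3d n) {
    V3d s = random_in_unit_sphere();
    V3d d = { n.x + s.x, n.y + s.y, n.z + s.z };
    return d;
}
```

The scattered ray is then ray(rec.p, that direction), with no add-then-subtract round trip.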
Hi! Awesome book for ray tracing, and I enjoyed my first tour very much! I am reading the refraction part, but I have the problem that I cannot get rid of the black stuff (edges). As I understand it, if the dielectric scatter fails it returns false to the calling color function, which then returns vec3(0,0,0) and causes the black edge. Anyone else with this issue?
Sorry I missed this comment until just now. I have often had similar things come up, and refraction I **never** get right the first time. I would start with known cases and step through in the debugger. Painful, but what I always have to do.
Has the new version of the book been published with the fixes for the comment above (regarding the missing mat_ptr)?
http://in1weekend.blogspot.com/2016/01/ray-tracing-in-one-weekend.html?showComment=1454599661068#c3278073328434186794
I ran into this problem as well and am looking for corrections. Also, is there a reason the source code for the book isn't available as reference?
This is a good point-- I was unaware when I started that Kindle doesn't auto-update changes. I think I will make the code available as a partial fix. For now, here's my uncleaned version: http://www.cs.utah.edu/~shirley/rtbook1/
The reason BTW was my belief that looking and typing was best. I am now thinking that is not a **good** reason :)
Awesome, thanks so much. I agree, I think looking and typing is best as well, but my main nit is that when the code is embedded in images (and the type is really small), it is kind of hard to read, even in the Kindle Web Viewer. I would prefer the code in some text form so I could then use the native zoom feature of the browser to enlarge the fonts properly.
Again, thanks for making this book. I look forward to working through the others in the series as well.
Finished the book in Go. Was able to make a very parallel version: https://twitter.com/cwalker/status/727357592926711808
I'd be curious to see your implementation in Swift. I'm nearly done revisiting the book, in Swift this time, to learn both the language and your material on ray tracing more deeply. Trying to get it nearly as performant as the C++ implementation is interesting. I write Swift for my day job, but sadly I don't get to spend much time getting things to run fast (or understanding why they run slow), so this has been quite eye opening. I've found good use cases for optionals (returning a hit record) and for tuples (returning scattering, attenuation, and scatter success), and I can make initializers a little more terse when necessary and when intent is obvious, so that's kind of nice.
ReplyDeleteMy implementation in swift was VERY basic. It didn't compile anymore last time I checked. I will dig it up and get it to compile again. Email me a reminder if I don't post it soon!
What's the point of this line of code
vec3 p = r.point_at_parameter(2.0);
at the end of the surface normal chapter, in the main function?
p is not used in the rest of the function.
D'oh! It is dead code. Good eye. Just trying to make the compiler writers proud of that optimization.
Looks like the first lower_left_corner calculation at the end of Chapter 10 is pointless too.
P.S. Great book. Thanks!
Hi Peter, thanks for this book, or really, thank you for book 3, because the material that book discusses is pretty hard to follow in the "reference tomes".
I wrote the dielectric as it was in the book, then thought the ref_idx term for the cosine was really strange. So it seems someone else here thought so as well, and you had addressed it.
I am better with math than physics, sadly; here's the code and my comments anyway, if you wish to take a look.
https://github.com/skurmedel/strahl0/blob/master/inc/dielectric.hpp
I probably misunderstood everything, but I think the derivation for Snell's is correct :D
Great comments! Nice version of the code.
Would anybody attending SIGGRAPH 2016 be interested in a Birds of a Feather meeting? There's still time to propose one.
Dr. Shirley, would you have any objection to a BOF session referring to "Ray-Tracing In One Weekend" in the title?
I'm really sorry I missed this until just now. Yes I have no objection. For future reference the answer to all such questions is go for it!
Hi! Just starting with your series of books.
It would be great if you wrote something along the lines of: Rasterization/game rendering in one weekend/month or so.
I think it would be possible, and even more in demand than this series.
Cheers!
I agree that book would have a market. I would buy it! I don't think I have the games background. But good idea.
DeleteHello, I was wondering if there was any chance you could also have this available in amazon.ca as a Canadian who is really interested in this series.
Cheers,
Vishahan Thilagakumar
Hi, I decided to implement my version in Haskell. I'm pretty sure that means a harder time eliminating performance bottlenecks, but hey...
I can use either Float or Double in the implementation, but haven't really noticed a performance difference either way. One thing I did notice, though, is artifacts caused by rounding errors when I use Float, so I'll stick to Double as a default for now: https://github.com/tvh/raytracinginoneweekend.
Cool! I don't know Haskell but it's very readable.
I usually use double for my ray tracers. Precision with float is usually doable but requires care. However, lots of people will complain if you use doubles. But once you are using anything but C you will get hassled anyway :)
Yeah, that gigantic sphere was maybe a bit silly anyway, but I didn't get around to implementing other primitives yet. I managed to optimize the code enough so that the difference Float/Double is about 10% over the whole process now. Before it was completely shadowed by packing/unpacking of boxed values.
Regarding doubles vs floats, I've been implementing this in Swift, and noticed some dramatic (and very visible) difference in brightness in the diffuse materials example (ch7). I dug and dug and finally isolated it to float/double rounding errors.
The largest, and the one that caused the brightness difference, was in sphere::hit. In the assignments of temp, using sqrtf rather than sqrt has a dramatic difference on the output. This really surprised me, since I assumed the impact of that would be detectable in the output, but too subtle to see easily.
So far, to get my Swift code to exactly match the C++ output, I've had to switch all sqrt() to sqrtf(), and also apply float casts here:
float u = float(i + float(drand48())) / float(nx);
float v = float(j + float(drand48())) / float(ny);
and here:
int ir = int(float(255.99)*col[0]);
int ig = int(float(255.99)*col[1]);
int ib = int(float(255.99)*col[2]);
(I could also explicitly promote the floats to doubles in Swift and then truncate back to floats, but that's a lot uglier than the equivalent C++.)
The float casts had a much smaller impact on the final results than those sqrt's in the discriminant.
Anyway, my suspicion is that the C++ would be easier to exactly match in other languages if it used double, since then there would be fewer implicit promotions/demotions, and double tends to be the more natural type (i.e. the type of literals) in many languages.
Thanks for the book; this has been a great weekend project over the holidays.
Hi, thank you for this awesome book. I have a little problem; I don't speak English well, so I made a screenshot:
http://i68.tinypic.com/2zyyk2x.png
The y display is reversed -- is it a problem? I tried many things but I'm not able to make it like yours.
Thank you for your help.
The link for the image http://imgur.com/a/hxQ8L
I think I have resolved my issue: the library I use inverts the order of putting pixels in the image.
And that happened all before I got to this. Sometimes procrastination pays :) Good luck with the project!
DeleteThis comment has been removed by the author.
Hey, Pete. I'm curious about something in the Lambertian material. You scatter rays by sampling within a unit sphere that just touches the surface at the point of intersection. That's a nice simple technique, and I like it. But I'm wondering if it is equivalent to sampling the hemisphere over the point of intersection (which I haven't implemented, but seems more complicated to do). If they are not equivalent, is one more physically correct than the other? I suppose that depends on the desired material characteristics?
It's not exactly Lambertian, but I was sure it was diffuse (fuzzy but not necessarily a cosine distribution). In the third book I do it "right" and get a slightly different answer. On my to-do list is to see how close it is.
Hello. Thanks for your great book series. All three books are simply fantastic. I hope this concept of self-publishing gains momentum. And I really like the refreshing overall "straight to the point and keep it as simple as possible" approach while still conveying most of the fundamental concepts.
I have ported the "Ray Tracing in One Weekend" final ray tracer to GLSL and put it online on Shadertoy. With a WebGL-capable browser and a fast graphics card it is real-time:
https://www.shadertoy.com/view/lssBD7
Super fun! Thanks, I will tweet that.
DeleteHi all dear guy!
ReplyDeletenice article great post comment information thanks for sharing.gclub casino
goldenslot casino
goldenslot
Hi! Thanks for the book, I am learning a lot and truly appreciate it. Question on the metal/reflect function: I tried what you had but got a black part on the sides of the metal sphere that were not facing the inner sphere. I negated both terms and got something similar to what you had (albeit with something wrong in the reflection of the bottom sphere). What do you think I'm missing?
Just to be clear, if I replace the original Lambertian sphere and put a metal material on it, it is pitch black (this is before the diffuse factor is brought in). How would I be able to debug this issue?
Hey, did you ever figure this problem out? I also noticed the same thing after implementing the metal spheres. I've been messing around with the color function and I'm getting some better results, but it still looks weird.
DeleteUseful Information, your blog is sharing unique information....
ReplyDeleteThanks for sharing!!!
resorts in godavari
resorts in costal andhra
resorts in konaseema
resorts in east godavari
resorts in andhra pradesh
EduwizzOnlineTraining is the Best Online Training Institute in Hyderabad, Bangalore. Eduwizz provide courses like Hybris Development, WebSphere Commerce Server,Blockchain Training,Hyperledger Fabric Development ,Ethereum Development ,Commvault Training, Devops , Netapps , Mulesoft ESB ,Machine Learning,Internet of Things , Hybris ,Angular JS , Node JS , Express JS , Business Analyst, Selenium testing with webdriver, Guidewire ,Adobe, RPA ,TSM, EMC...etc
Hi! Thanks again for the book. I'm using it to learn Rust along the way. I made it a spectral raytracer this time. Already looks cool when using a material with high dispersion like SF11. I'll have to implement some light sources and triangles next to play around with prisms.
https://github.com/tvh/rayer-rs
Thank you for the book! I'm up to chapter 7 now and use JavaScript to render in a browser. You can see the output for a simple test case here https://rkibria.github.io/multiray.js/ and the github project is here https://github.com/rkibria/multiray.js
The images take about 4 seconds each to render in Firefox on a 2.2GHz i7, and surprisingly 50% longer on Chrome.
The final random spheres demo image from the first book drawn live on a webpage: https://rkibria.github.io/multiray.js/randomscene.html
ReplyDeleteHi - I'm liking the book so far but I don't get the maths at kindle location 137 where you say 'The rules of vector algebra are all that we would want here, and if we expand that equation and move all the terms to the left hand side we get:'
I just don't get how you are going between the two equations involved... can you go into any more detail or provide a link to a good, free reference on 'vector algebra'?
Thanks
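In case it helps others stuck on that same step: with the ray written as p(t) = A + tB, "expand and move everything to the left" is just distributing the dot product, which behaves like ordinary multiplication (and B.(A - C) = (A - C).B):

```latex
(\mathbf{p}(t)-\mathbf{C})\cdot(\mathbf{p}(t)-\mathbf{C}) = R^2
\;\Longrightarrow\;
(\mathbf{A}+t\mathbf{B}-\mathbf{C})\cdot(\mathbf{A}+t\mathbf{B}-\mathbf{C}) = R^2
\;\Longrightarrow\;
t^2\,\mathbf{B}\cdot\mathbf{B} + 2t\,\mathbf{B}\cdot(\mathbf{A}-\mathbf{C}) + (\mathbf{A}-\mathbf{C})\cdot(\mathbf{A}-\mathbf{C}) - R^2 = 0
```

which is an ordinary quadratic in t, so its discriminant tells you whether the ray hits the sphere.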
Amazing job, thanks a lot for sharing. I wonder whether there is a small typo in the PDF when building up the sphere class extending from hitable, as the parameter t should be obtained by dividing by 2*a and not a. Maybe I own an old copy of the PDF and the typo is gone.
Anyway, thanks a lot for sharing your knowledge!!
Actually it'd better read:
float temp = (-b - sqrt(discriminant)) / (2*a)
float temp = (-b + sqrt(discriminant)) / (2*a)
than
float temp = (-b - sqrt(b*b-a*c)) / (2*a)
float temp = (-b + sqrt(b*b-a*c)) / (2*a)
Can you tell me how I should view the image? I am stuck on the very first problem; the code is there and the compiler is there, but I don't understand how to view the image. What is the procedure?
You can use GIMP to view it.
Can you tell me in detail? Should I compile it first and then use GIMP to open the file?
ReplyDeleteThank for providing good information for site,Thanks for your sharing.
ReplyDeleteCASINO iwin89 ONLINE
Please tell me, what should I do after writing the code?