Friday, December 31, 2010

What is your current reason for living?

This question is answered here: http://briggs.id.au/jour/2010/12/what-is-your-current-reason-for-living/

(I now field questions here: http://briggs.id.au/jour/qanda/ )

URL: http://www.formspring.me/briggswill/q/2001611591

Imagine you are Atheist, and cannot believe in God no matter how much you want to, what would you live for?

Answered here: http://briggs.id.au/jour/2010/12/qa-imagine-you-are-atheist/

(I now answer questions through my blog at http://briggs.id.au/jour/qanda/ )

URL: http://www.formspring.me/briggswill/q/2003873277

Thursday, December 30, 2010

In Heaven, those who endured bad things on Earth are comforted (http://ref.ly/Lk16.25) implying we’ll remember what we’ll be comforted for. Can you add your opinion to this?

Answered here: http://briggs.id.au/jour/2010/12/qa-in-heaven-those-who-endured/

(I now receive questions here: http://briggs.id.au/jour/qanda/ )

URL: http://www.formspring.me/briggswill/q/1995925874

Where has your formspring gone?

I've migrated Q&A to my new blog site at http://briggs.id.au/jour

Questions can be asked through the form at http://briggs.id.au/jour/qanda

When formspring (finally) releases their API I may consider integrating the blog and formspring more closely, but in the meantime, please ask questions through the blog.

Thanks.

URL: http://www.formspring.me/briggswill/q/1995738623

2010, an excellent year for raytracing!

What an exciting year this has been, for raytracing at least. There has been a huge buzz around accelerated ray tracing and unbiased rendering, in which the GPU has played a pivotal role. A little overview:

- Octane Render is publicly announced. A demo is released which lets many people experience high-quality unbiased GPU rendering for the first time. Unparalleled quality and amazing render times, even on an aging GeForce 8800 GTX, catch many by surprise.

- Arion, the GPU sibling of Random Control's Fryrender, is announced shortly after Octane. It touts hybrid CPU+GPU unbiased rendering as a distinguishing feature. The product eventually releases at a prohibitively expensive price (1000€ for one multi-GPU license).

- Luxrender's OpenCL-based GPU renderer SmallLuxGPU integrates stochastic progressive photon mapping, an unbiased rendering method which excels at caustic-heavy scenes.

- Brigade path tracer is announced: a hybrid (CPU+GPU) real-time path tracer aimed at games. Highly optimized and very fast, with user-defined quality, it is the first path tracer with support for dynamic objects. GI quality greatly surpasses virtual point light/instant radiosity based methods and even photon mapping; it can theoretically handle all types of BRDF, is artifact-free (except for noise), nearly real-time, and has no screen-space limitations. The biggest advantage over other methods is progressive rendering, which instantly gives a good idea of the final converged image (some filtering and LOD scheme, similar to VoxLOD, could produce very high quality results in real-time). Very promising; it could be the best option for high-quality dynamic global illumination in games in 2 to 3 years.
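The progressive rendering that makes this approach so usable boils down to keeping a running average of samples per pixel, so every pass yields a complete (if noisy) image that refines over time. A minimal sketch of the idea (the radiance function here is a hypothetical stand-in for tracing one light path; a real path tracer would return the path's contribution):

```python
import random

def radiance(x, y, rng):
    # Hypothetical stand-in for tracing one random light path through
    # pixel (x, y); a real path tracer returns the path's contribution.
    return rng.random()

def progressive_render(width, height, passes, seed=0):
    """Add one sample per pixel per pass and keep a running average,
    so every pass yields a complete (noisy) image that converges."""
    rng = random.Random(seed)
    accum = [[0.0] * width for _ in range(height)]
    for n in range(1, passes + 1):
        for y in range(height):
            for x in range(width):
                # incremental mean: avg += (sample - avg) / n
                accum[y][x] += (radiance(x, y, rng) - accum[y][x]) / n
        yield accum  # a displayable frame after every pass
```

Each yielded frame is an unbiased estimate at any pass count, which is why a progressive renderer can give an instant preview of the converged image.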

- release of the Nvidia Fermi GPU: caches and other enhancements (e.g. concurrent kernel execution) give ray tracing tasks an enormous boost, up to 3.5x faster in scenes with many incoherent rays compared to the previous architecture. Design Garage, an excellent tech demo featuring GPU path tracing, is released alongside the cards.

- Siggraph 2010 puts heavy focus on GPU rendering

- GTC 2010: Nvidia organizes a whole bunch of GPU ray tracing sessions covering OptiX, iray, etc.

- John Carmack re-expresses interest in real-time ray tracing as an alternative rendering method for next-generation games (besides sparse voxel octrees). He even started twittering about his GPU ray tracing experiments in OpenCL: http://raytracey.blogspot.com/2010/08/is-carmack-working-on-ray-tracing-based.html

- GPU rendering gets more and more criticized by the CPU rendering crowd (Luxology, Next Limit, their userbase, ...) feeling the threat of decreased revenue

- release of mental ray's iray

- release of V-Ray RT GPU, the product that started the GPU rendering revolution

- Caustic Graphics is bought by Imagination Technologies, maker of the PowerVR GPUs. A surprising and potentially successful move for both companies. Hardware-accelerated real-time path tracing at very high sampling rates (higher than on Nvidia Fermi) could become possible. PowerVR GPUs are integrated in the Apple TV, iPad, iPhone and iPod Touch, so this is certainly something to keep an eye on in 2011. Caustic doesn't disappoint when it comes to hype and drama :)

- one of my most burning questions since the revelation of path tracing on the GPU, "is the GPU capable of more sophisticated and efficient rendering algorithms than brute-force path tracing?", was answered just a few weeks ago, thanks to Dietger van Antwerpen and his superb work on GPU-based Metropolis light transport and energy redistribution path tracing.

All in all, 2010 was great for me and delivered a lot to write about. Hopefully 2011 will be at least equally exciting. Some wild speculation of what might happen:

- Metropolis light transport starts to appear in commercial GPU rendering software (very high probability for Octane)
- more news about Intel's Knights Corner/Ferry, with maybe some performance numbers (unlikely)
- Nvidia launches Kepler at the end of 2011, offering 3x the path tracing performance of Fermi (too good to be true?)
- PowerVR maker Imagination Technologies and Caustic Graphics bring hardware-accelerated real-time path tracing to a mass audience through Apple mobile products (would be great)
- Luxology and Maxwell Render reluctantly embrace GPU rendering (LOL)
- finally a glimpse of OTOY's real-time path tracing (fingers crossed)
- Brigade path tracer gains exposure and awareness with the release of the first path traced game in history (highly possible)
- ...

Joy!

Monday, December 27, 2010

Global illumination with Markov Chain Monte Carlo rendering in Nvidia Optix 2.1 + Metropolis Light Transport with participating media on GPUs

Optix 2.1 was released a few days ago and includes a Markov Chain Monte Carlo (MCMC) sample, which only works on Fermi cards (New sample: MCMC - Markov Chain Monte Carlo method rendering. A global illumination solution that requires an SM 2.0 class device (e.g. Fermi) or higher).

MCMC rendering methods, such as MLT (Metropolis light transport) and ERPT (energy redistribution path tracing), are partially sequential because each path of a Markov chain depends on the previous path, and they are therefore more difficult to parallelize for GPUs than standard Monte Carlo algorithms. This is an image of the new MCMC sampler included in the new Optix SDK, which can be downloaded from http://developer.nvidia.com/object/optix-download.html.
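The sequential nature is easy to see in a plain Metropolis sampler, sketched here in 1-D (the target function is a hypothetical stand-in for a light path's importance; a real MCMC renderer mutates whole light paths the same way):

```python
import math
import random

def target(x):
    # Hypothetical 1-D stand-in for a light path's importance: a narrow
    # peak, analogous to a scene lit through a small opening.
    return math.exp(-((x - 0.7) ** 2) / (2 * 0.01))

def metropolis(n_samples, step=0.05, seed=1):
    """Plain Metropolis sampling: each state is a small mutation of the
    previous one, accepted with probability min(1, f(y)/f(x)). The
    x -> y dependence is exactly what resists parallelization."""
    rng = random.Random(seed)
    x = rng.random()
    samples = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)  # tentative mutation of current state
        if rng.random() < min(1.0, target(y) / target(x)):
            x = y  # accepted: the chain moves; otherwise it stays put
        samples.append(x)
    return samples
```

GPU implementations typically sidestep the dependence by running many shorter chains in parallel rather than one long chain; whether the OptiX sample does exactly that is not documented here.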




There is also an update on the Kelemen-style Metropolis Light Transport GPU renderer from Dietger van Antwerpen. He has released this new video showing Metropolis light transport with participating media running on the GPU: http://www.youtube.com/watch?v=3Xo0qVT3nxg



This scene is straight from the original Metropolis light transport paper by Veach and Guibas (http://graphics.stanford.edu/papers/metro/metro.pdf). Participating media (like fog, smoke and god rays) are among the most difficult and compute-intensive phenomena to simulate accurately with global illumination, because they are essentially volumetric effects in which light scattering occurs. Subsurface scattering belongs to the same category of expensive, difficult-to-render volumetric effects. The video shows it can now be done in almost real-time with MLT, which is pretty impressive!

Friday, December 24, 2010

Move over OTOY, here comes the new AMD tech demo!

June 2008: Radeon HD 4870 launches with the OTOY/Cinema 2.0/Ruby tech demo featuring voxel raytracing. It can't get much closer to photorealism than this... or can it?

December 2010: Radeon HD 6970 launches with this craptastic tech demo. Talk about progress. Laughable fire effects, crude physics with only a few dozen dynamic objects, pathetic Xbox 1 city model and lighting, uninspired Mecha design. It may be just a tech demo but this is just a disgrace for a high-tech GPU company. Well done AMD! Now where the hell is that Cinema 2.0 Ruby demo you promised dammit? My HD 4890 is almost EOL and already LOL :p

Sunday, December 19, 2010

Immigration


A Kiss Goodnight


GPU-accelerated biased and unbiased rendering

Since I've seen the facemeltingly awesome youtube video of Kelemen-style MLT+bidirectional path tracing running on a GPU, I'm quite convinced that most (if not all) unbiased rendering algorithms can be accelerated on the GPU. Here's a list of the most common unbiased algorithms which have been ported successfully to the GPU:

- unidirectional (standard) path tracing: used by Octane, Arion, V-Ray RT GPU, iray, SmallLuxGPU, OptiX, Brigade, Indigo Render, a bunch of renderers integrating iray, etc. Jan Novak is one of the first to report a working implementation of path tracing on the GPU (implemented with CUDA on a GTX 285, https://dip.felk.cvut.cz/browse/pdfcache/novakj8_2009dipl.pdf). The very first paper reporting GPU path tracing is "Stochastic path tracing on consumer graphics cards" from 2008 by Huwe and Hemmerling (implemented in GLSL).
- bidirectional path tracing (BDPT): http://www.youtube.com/watch?v=70uNjjplYzA, I think Jan Novak, Vlastimil Havran and Carsten Dachsbacher made this work as well in their paper "Path regeneration for interactive path tracing"
- Metropolis Light Transport (MLT)+BDPT: http://www.youtube.com/watch?v=70uNjjplYzA
- energy redistribution path tracing (ERPT): http://www.youtube.com/watch?v=c7wTaW46gzA, http://www.youtube.com/watch?v=d9X_PhFIL1o
- (stochastic) progressive photon mapping (SPPM): used by SmallLuxGPU; there's also a GPU-optimised, parallelised version on Toshiya Hachisuka's website; CUDA http://www.youtube.com/watch?v=zg9NcCw53iA, OpenCL http://www.youtube.com/watch?v=O5WvidnhC-8

Octane, my fav unbiased GPU renderer, will also implement an MLT-like rendering algorithm in the next version (beta 2.3 version 6), which is "coming soon". I gathered some interesting quotes from radiance (Octane's main developer) regarding MLT in Octane:

“We are working on a firefly/caustic capable and efficient rendering algorithm, it's not strictly MLT but a heavily modified version of it. Trust me, this is the last big feature we need to implement to have a capable renderer, so it's our highest priority feature to finish.”

“MLT is an algorithm that's much more efficient at rendering complex scenes, not so efficient at simple, directly lit scenes (eg objects in the open). However MLT does sample away the fireflies.”

“The fireflies are a normal side effect of unbiased rendering, they are reflective or refractive caustics. We're working on new algorithms in the next version that will solve this as it will compute these caustics better.”

“they are caustics, long paths with a high contribution, a side effect of unbiased path tracing. MLT will solve this problem which is in development and slated for beta 2.3”

“the pathtracing kernel already does caustics, it's just not very efficient without MLT, which will be in the next 2.3 release.”

“lights (mesh emitters) are hard to find with our current algorithms, rendertimes will severely improve with the new MLT replacement that's coming soon.”

“it will render more efficiently [once] we have portals/MLT/bidir.”

“All exteriors render in a few minutes clean in octane currently. (if you have a decent GPU like a medium range GTX260 or better). Interiors is more difficult, requires MLT and ultimately bidir path tracing. However, with plain brute force pathtracing octane is the same or slightly faster than a MLT/Bidir complex/heavily matured [CPU] engine, which gives good promise for the future, as we're working on those features asap.”

With all unbiased rendering techniques soon possible and greatly accelerated on the GPU, what about GPU acceleration for biased production rendering techniques (such as photon mapping and irradiance caching)? There have been a lot of academic research papers on this subject (e.g. Purcell, Rui Wang and Kun Zhou, Fabianowski and Dingliani, McGuire and Luebke, ...), but since it's a lot trickier to parallelize photon mapping and irradiance caching than unbiased algorithms while still obtaining production quality, it's still not quite ready for integration into commercial software. But this will change very soon imo: on the ompf forum I've found a link to a very impressive video showing very high-quality CUDA-accelerated photon mapping: http://www.youtube.com/watch?v=ZTuos2lzQpM.
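Part of what makes photon mapping trickier to parallelize is its two-pass structure: scatter photons into a spatial data structure, then gather them with nearest-neighbor queries at shading time. A toy 1-D sketch of the two passes (the light source, the unit-length "floor" and all constants are hypothetical stand-ins):

```python
import random

def trace_photons(n, seed=3):
    """Pass 1: scatter n photons onto a unit-length 1-D 'floor',
    each carrying an equal share of the light's total flux."""
    rng = random.Random(seed)
    return [(rng.random(), 1.0 / n) for _ in range(n)]

def density_estimate(photons, x, radius=0.05):
    """Pass 2: estimate irradiance at x by gathering the flux of all
    photons within a search radius (the k-NN gather of photon mapping)."""
    flux = sum(p for pos, p in photons if abs(pos - x) <= radius)
    return flux / (2 * radius)  # flux per unit length
```

Real photon maps use a kd-tree for the gather step; it is this scattered, irregular neighbor search (and building the tree itself) that resists parallelization, whereas a path tracer's independent samples parallelize trivially.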

This is a render of Sponza, 800x800 resolution, rendered in 11.5 seconds on 1 GTX 470! (image taken from http://kaikaiwang.blogspot.com/):

11.5 seconds for this quality and resolution on just one GPU is pretty amazing if you ask me. I'm sure that further optimizations could bring the render time down to 1 second. The video also shows real-time interaction (scale, rotate, move, delete) with objects in the scene (something that could be extended to support many dynamic objects via HLBVH). I could see this being very useful for real-time, production-quality global illumination using a hybrid of path tracing for exteriors and photon mapping for interiors, caustics and point lights.

Just like 2010 was the year of GPU-accelerated unbiased rendering, I think 2011 will become the year of heavily GPU-accelerated biased rendering (photon mapping in particular).

Saturday, December 18, 2010

Friday, December 17, 2010

If By “a Glitch” You Mean “Ignorance”

A rebuttal to “Creating a Glitch In The Industry” by Christian Nutt

Also a sideline commentary to Chris Pirillo’s ignorance.

 

Why is it that every time some big-shot in technology suddenly thinks they’ve figured out the magical missing piece of online virtual environments, they invariably make an ass out of themselves in public? Just because Stewart Butterfield knows a thing or two about how to let you share photos online (Flickr) doesn’t make him qualified in any regard whatsoever to lecture about what makes a virtual environment system successful.

 

“Post-Flickr, Butterfield has moved forward with plans to launch Glitch, which he hopes will become a successful social online world in a way different from traditional MMOs”, says Christian Nutt, a reporter for Gamasutra.com, but he’s already managed to nosedive this plane into the ground as far as I can tell.

 

From Stewart Butterfield:

Second Life... Yeah, "was."

 

Yeah. Yeah, well... In both cases, I think, there wasn't enough game context. Well, there wasn't any game context to take off. I remember the first time I ever installed Second Life and sat down, I was like... First of all, it was super fucked up then. I mean, it was really buggy... That was probably 2003.

 

And actually at that time, there was kind of a buzz. There was Second Life, then there was There, and then the Sims Online was about to come out. We felt like that was like, not "social games" in today's sense, but there was going to be this era of social games, and all of them busted basically.

 

I think There [did] just because they spent too much money, because otherwise a lot of it had really nice polish and nice feel. When you were talking to someone, they had a great way of doing eye contact and spreading people out in a group, so it was a good social experience.

 

But again there wasn't any game there, and it was all about these brands. I don't want to go into a virtual world and look at Gap shit, American Eagle T-shirts... It's just... I don't know, it seems kind of gross.

 

I'm sure you could write a psychology thesis on it or something like that, but you can't really role-play in that context. If you have real world brands in front of you, you can't... You can't invent a persona because you can make yourself look different and you can fly and stuff like that. I don't know, it definitely breaks the magic circle. There's no real opportunity for playing.

 

The first thing I notice is that this guy swears about as much as a sailor, and the second thing I notice is his shallow understanding of virtual environments, based entirely on archaic and limited exposure to the systems he feels the need to write off.

 

Imagine if I were to criticize the feasibility of the World Wide Web and Internet today based on a limited experience I had from 1998. Doing so would ruin any chance I had professionally to embark on an Internet related venture, and the only reason I could possibly raise venture capital would be due to the investors being as ignorant on the subject as I was.

 

Of course it would be in his best interest to discount any other virtual environment to date, merely because he’s attempting to make his own called Glitch. However, based on his limited experience (which is about as robust as a goldfish in an ocean), coupled with his penchant for comparing the worthiness of technologies today based on his limited exposure nearly ten years ago, I can promise you that Glitch will crash and burn under that misguided attitude.

 

World of Warcraft is a game.

Glitch is a game (or will be).

Sims Online is (was) a game.

Final Fantasy XI Online is a game.

 

Second Life is not a game. When you log into a virtual environment system like Second Life, there are no pre-planned goals for you to follow like “Collect 100 Coins and Save the Princess”. The goal of Second Life is essentially whatever you want it to be, and that takes a bit of imagination on your part, because you aren’t exactly being spoon-fed pre-made storylines or goals like a game would gladly do for you.

 

Sure, there exist games with pre-planned goals in Second Life, but those are simply part of the greater whole of the virtual environment, much like the World Wide Web happens to have pornography and sites like Newgrounds.com to play games. The whole is greater than the sum of its parts, and this understanding apparently escapes Mr. Butterfield with deft precision usually reserved for ninjas.

 

NinjaTeaParty

… and they’re holding Mr. Butterfield’s common sense hostage

Sandbox virtual environments are built on asynchronous interaction and creation. Unlike a “social game”, this type of virtual environment thrives entirely on the creations of its users, and not some multi-million dollar digital art studio (Square Enix). Everything you see in a virtual environment like Second Life was created by the users of that environment, and therein lies the power of a virtual environment over a mere “game”. There may exist some companies that specialize in game creation within Second Life (such as MadPea), but they are the exception to the rule and are still governed by the user-generated content rule of thumb.

 

I’ve heard the same sorts of misguided criticisms of Second Life from Chris Pirillo (lockergnome) who went off on a tangent recently about Second Life and how it was essentially overrun by gambling and porn, and how he didn’t understand the point or “what to do” when he logged in back in 2007.

 

 

It’s overrun with porn and gambling. I should know, I logged in back in 2007 for a few hours. – Chris Pirillo

 

Meanwhile, one of the residents of Second Life ( @OliverSzondi ) reached out to Chris and offered to take him on a tour, which Chris accepted, and then streamed live while criticizing every aspect.

 

 

It’s still broken and useless based on my half hour of using it. Also, get off my lawn! – Chris Pirillo

 

So here are two examples of high-profile people recently talking about how Second Life is inferior, and how they simply “don’t get it”, and yet their experience with the system amounts to less than a handful of days, based on usage from many years ago, which predisposes them to unfathomable bias. In the case of Chris Pirillo, we’re talking about a guy who immediately assumes, since his experience with Second Life consists of a cursory glance, that it can’t possibly have anything to do with him.

 

A word of advice, Mr. Pirillo – you are an avid user of the Internet and a technology celebrity. The assertions and claims you make against Second Life are of the same merit as if somebody told you that the Internet is simply a wasteland overrun by pornography and gambling, and therefore should be discounted entirely. It is sweeping and wholly ignorant generalizations such as this which damage your credibility in the technology field and bring your ability to accurately assess the merits of technology under suspicion.

 

You did not need a guided tour of Second Life to see the sights, Mr. Pirillo; you simply needed to click the Destination Guide button on the right side to see all of the interesting events and destinations. It’s one button to click, and it was staring you in the face the entire time you took the tour. Also, you took the time to adjust your graphics to minimum settings, yet it never occurred to you to uncheck “Autoplay Media”; it is therefore by your own ignorance that every place you visit starts playing music, not because the viewer is at fault.

 

I find it very interesting that you, Mr. Pirillo, chose to use Text to Speech to read off the text chat for you, but apparently it never occurred to you to turn on VoIP in the viewer to actually talk.

 

And therein lies the point I’m making with regard to both Stewart Butterfield and Chris Pirillo. You simply cannot make an accurate assessment of a technology based on extremely limited exposure, and definitely not if that exposure was many years ago.

 

I'm sure you could write a psychology thesis on it or something like that, but you can't really role-play in that context. If you have real world brands in front of you, you can't... You can't invent a persona because you can make yourself look different and you can fly and stuff like that. I don't know, it definitely breaks the magic circle. There's no real opportunity for playing.

 

Again, I had to reiterate that last part of the quote from Stewart Butterfield because it outlines how substantially ignorant this man is on the subject. (Chris Pirillo obviously speaks for himself in the videos above… which is a shame, because he really should have known to keep his mouth shut and apologize for being ill-informed and ignorant of the technology before talking about it.)

 

Apparently Stewart Butterfield has never seen a roleplaying community in Second Life like INSILICO, and he has no concept of what the Second Life Destination Guide is about, since his exposure to Second Life dates back to 2003 and amounts to a handful of hours. Clearly this man is not exactly a marketing genius if he is unable to understand the potential of allowing real-world brands into the virtual environment as an offering.

 

Since there is no real opportunity for playing, as he asserts, then clearly MadPea Island and all of those games must be a figment of my imagination. I must have been a peyote-licking lunatic (more than usual) when I was racing cars on a track, or enjoying the Craig Lyons concert last night with Lindsay, because I was pretty damned entertained. Of course, multiplayer gaming such as this Bomberman game in Second Life must also not exist, according to Mr. Butterfield.

 

 

I’m sorry, Mr. Butterfield… you were saying?

 

 

 

You’re going to have to speak up, Mr. Butterfield. This non-existent live concert is loud.

 

 

 

I’m sorry, Mr. Butterfield… I can’t hear you over how awesome INSILICO is.

 

 

The difference here is that Second Life is a creation of its participants, while games like Glitch will be a choose-your-own-adventure with multiplayer. In this regard, virtual environments like Second Life (and generally any sandbox virtual environment) are vastly superior to any pre-chewed storyline you could dream up.

 

Those who fail to “get” Second Life are the same people who have spent less than a week using the system. A word of advice for all technology professionals: if you spent only a few hours in Second Life 4–8 years ago, you need to keep your mouth shut. Your ignorance on the subject is appalling at best.

 

I’m sure Mr. Butterfield will make an interesting game, but if I were to forecast the fate of that game, ten years from now it will most likely either not exist or be a smoldering digital wasteland.

 

The people who log into Second Life do not need uninspired, non-creative, ignorant people who are incapable of thinking for themselves or making their own destiny and story. While there are plenty of such people logged into Second Life already, it stands to reason that the level of willful ignorance displayed by individuals such as Chris Pirillo and Stewart Butterfield far surpasses that of the average newbie in Second Life. How can we tell the difference?

 

They log in and ask somebody “How do I play this game?”

Wednesday, December 15, 2010

Real-time Metropolis Light Transport on the GPU: it works!!!!


This is probably the most significant news since the introduction of real-time path tracing on the GPU. I've been wondering for quite a while if MLT (Metropolis light transport) would be able to run on current GPU architectures. MLT is a more efficient and more complex algorithm than path tracing for rendering certain scenes which are predominantly indirectly lit (e.g. light coming through a narrow opening, such as a half-closed door, and illuminating a room), a case in which path tracing has much difficulty finding "important" contributing light paths. For this reason, it is the rendering method of choice for professional unbiased renderers like Maxwell Render, Fryrender, Luxrender, Indigo Render and Kerkythea Render.

Dietger van Antwerpen, an IGAD student who co-developed the Brigade path tracer and who also managed to make ERPT (energy redistribution path tracing) run in real-time on a Fermi GPU, has posted two utterly stunning and quite unbelievable videos of his latest progress:

- video 1 showing a comparison between real-time ERPT and path tracing on the GPU:

ERPT on the left, standard path tracing (PT) on the right. Light is coming in from a narrow opening, a scenario in which PT has a hard time finding light paths and converging, because it samples the environment randomly. ERPT shares properties with MLT: once it finds an important light path, it samples nearby paths via small mutations of the found path, so convergence is much faster.
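How much mutation helps can be illustrated with a toy 1-D experiment: compare blind uniform sampling against a chain that mutates a known contributing path (the "opening" interval, the mutation size and the known starting path are all hypothetical stand-ins):

```python
import random

def through_opening(x):
    # Hypothetical stand-in for a visibility test: only paths landing in
    # a narrow interval "fit through the opening" and carry light.
    return 0.69 < x < 0.71

def hit_rates(n, seed=2):
    """Compare blind uniform sampling with mutation-based resampling:
    the mutating chain keeps exploring near a contributing path."""
    rng = random.Random(seed)
    uniform_hits = sum(through_opening(rng.random()) for _ in range(n))
    x, mutation_hits = 0.70, 0  # assume one important path is already known
    for _ in range(n):
        y = x + rng.gauss(0.0, 0.005)  # small mutation of the current path
        if through_opening(y):
            x = y  # stay near the opening and keep sampling there
            mutation_hits += 1
    return uniform_hits / n, mutation_hits / n
```

Blind sampling hits the opening about 2% of the time here, while the mutating chain contributes useful samples on most steps; ERPT and MLT exploit exactly this, with extra machinery to keep the estimate unbiased.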

- video 2 showing Kelemen-style MLT (an improvement on the original MLT algorithm) running in real-time on the GPU. The video description mentions Kelemen-style MLT on top of bidirectional path tracing (BDPT) with multiple importance sampling, which is pretty amazing.

Kelemen-MLT after 10 seconds of rendering at 1280x720 on a single GTX 470. The beautiful caustics are possible thanks to bidirectional path tracing + MLT and are much more difficult to obtain with standard path tracing.

These videos are ultimate proof that current GPUs are capable of more complex rendering algorithms than brute-force standard path tracing and can potentially accelerate the very same algorithms used in the major unbiased CPU renderers. This bodes very well for GPU renderers like Octane (which has its own MLT-like algorithm), V-Ray RT GPU, SmallLuxGPU and iray.

If Dietger decides to implement these in the Brigade path tracer, we could be seeing (quasi) noise-free, real-time path traced (or better, "real-time BDPT with MLT" traced) games much sooner than expected. Verrrry exciting stuff!! I think some rendering companies would hire this guy instantly.

Monday, December 13, 2010

Final arcade

OK, finished the arcade. Had to delete a lot of it and probably still broke the tri count limit... oh well, I had fun.

Friday, December 10, 2010

Arcade High Detail

Some initial renders of part of the arcade machine I'm working on. And a photo of the original reference machine.





Voxels again

Just encountered a very nice blog about voxel rendering, sparse voxel octrees and massive procedural terrain rendering: http://procworld.blogspot.com

The author has made some videos of the tech using OpenCL, showing the great detail that can be achieved when using voxels: http://www.youtube.com/watch?v=PzmsCC6hetM, http://www.youtube.com/watch?v=oZ6x_jbZ2GA. It does look a bit like the Atomontage engine.

Wednesday, December 8, 2010

New app out of the oven




I just released a new app that:

  • Updates Chromium for you
  • Installs Chromium for you in case it's not detected
  • Updates itself


Download it HERE

Monday, December 6, 2010

Bowling ball return

Just some renders of a bowling ball return I'm working on for a project.



Sunday, December 5, 2010

#SecondLife Shopping: Bax Coen Boots

A Holiday Shopping Review

SLURL: http://bit.ly/gxxM22




Bax Coen, posing by her new boot line: Prestige


Over the past few days I’ve been looking, in vain, for some sort of shopping guide when it comes to brand names in Second Life. At first, I thought it would be a good idea to check out the TheBOSL.com shopping guide, but found that the majority of their “shopping guide” is defunct.


One would think that with a name like Best of Second Life, the brand would automatically institute a level of excellence over its endeavors; expecting that is what led me to the interesting and confusing scenario of clicking SLURL after SLURL on the site, only to be led nowhere or to discontinued locations.


Maybe Frolic Mills is stretching himself a little thin these days, and the pressure of keeping his empire running is taking its toll? We can say for certain that Frolic is a very busy man: he recently co-sponsored the SLAnthem.com contest for Second Life, and runs BOSL Magazine (not to mention BOSL Radio… does that still exist? I dunno).


Faced with this situation, I would normally turn to something like the Second Life Destination Guide, but I’m a bit leery about trusting those recommendations. It’s not that the Destination Guide doesn’t provide interesting or outstanding locations to check out, but something seems a bit too cozy about the relationship between regular Destination Guide locations and Linden Lab to take it entirely at face value. Essentially, it’s all filtered through Linden Lab employees, who ultimately control whether or not they personally think a location has enough merit to bother listing it.


Of course, that’s also assuming you ever get the Destination Guide Submission Form to actually work. This alone makes me raise an eyebrow as to how, exactly, destinations are actually submitted or chosen for the guide to begin with.


I won’t even bother using the classifieds in Second Life as an indication of quality or popularity, as it’s essentially just buying your way to the top.


Considering these two obvious options are off the table, in my case one involuntarily and one by choice, I decided to look around and ask friends what they thought would serve as a brand worth noting in Second Life. Part of this conversation led to an in-depth discussion in-world with Jewlie Deisel of Kitten’s Studio about which locations are on her list when she takes avatars shopping.


Surprisingly (or maybe not so much), there were a lot of locations we both had on our lists, while others were new to one or the other of us. Maybe not new, but more like ones we had never considered.


Among those was a landmark Jewlie provided simply named Bax Coen Boots.


Bax Coen Model

The New Prestige Boots from Bax Coen: Priced Affordably at 875L


I've been to places like Stiletto Moody for shoes and boots before, because unsurprisingly they are "the brand" to own. Upon stopping into Stiletto Moody last night with my partner, Lindsay Heslop, however, I could only sit and wonder how a place can realistically charge 2795L for a single pair of boots or shoes. At first I tried to justify it because of the brand and quality, but the more I thought about it, the more I realized that Stiletto Moody shoes and boots aren't actually worth the price.


Obviously Stiletto Moody has the brand recognition, but I suspect that the owner is just taking advantage of that recognition by hyper-inflating the prices, knowing full well that the brand will compel people to still purchase. Stiletto Moody is a fine shoe and boot, but other than the brand name there is nothing I can see that makes them worth the retail price.


Which brings me back to Bax Coen Boots.


Previously I had not known about Bax Coen, so I was a little skeptical when Jewlie told me about the location. I had seen no marketing or classifieds, nothing on the Destination Guide radar, and no mentions in the mainstream SL media. Of course, this brings me back to why I don't entirely trust Second Life's Destination Guide to clue me in to quality shopping locations. There is often something interesting in the guide, but I have yet to see Bax Coen in it. I did, however, happen to see Stiletto Moody listed in the Destination Guide… go figure.


Let's just say that Bax Coen is exactly the reason I put the Destination Guide aside and looked for personal recommendations.


Customer Service


Most places you visit in Second Life are barren and self-serve. Let's face it, shopping in Second Life is a "fend for yourself" affair, with countless walls and displays on automatic. There is the online Marketplace, of course, but that takes it to the level of catalog shopping.


Not so with Bax Coen Boots. Shortly after arriving with my partner, while we were looking over the new line of boots, we read a notice stating that all boots will gladly be fitted free of charge in the store upon request. "Well," I thought, "that would be some amazing customer service…"; at least compared to most stores in Second Life.


Sure enough, soon after a customer showed up and bought some boots, Bax Coen herself appeared to help the customer fit her new boots.


Bax Coen_013


Ok, this right here is what floored me. Could it be? Actual customer service in Second Life?


Well, hells bells. Not just customer service, but the owner and proprietor of the store, gladly helping her customers with their purchases in the actual store location.


I think this broke my brain for an undetermined amount of time, because after waking up from the blackout I recall hearing Bax Coen offering to help Lindsay fit her new boots as well.


Let’s put this into perspective.


Prior to stopping into Bax Coen, Lindsay and I had wandered through Stiletto Moody looking at boots and shoes priced at roughly 2795L a pair, with no semblance of customer service or an employee to be found anywhere at the store. Visiting Bax Coen Boots was the polar opposite experience.


The boots at Bax Coen (the ones Lindsay and I were looking at) were priced reasonably at 875L per pair, came with a remote HUD, and had impressive texture and sculpt quality (I'd personally say on par with Stiletto Moody), all for a fraction of Stiletto Moody's price.


Out of curiosity I began to wonder what the Stiletto Moody price range would purchase at Bax Coen, and my question was quickly answered by a sign next to the same boots proclaiming 2699L for a Bax Coen Fatpack.


So what does the Fatpack get you for 2699L?


4 Styles of Boots with 13 color choices, and an L$801 savings.


I'm no rocket scientist, but the math is easy to add up: four pairs at 875L apiece would cost 3500L, so the 2699L fatpack saves you 801L. Not to mention the prior offer of having a live representative come fit your new boots for you in the store.


I know I’m using a lot of large type and bold in this post, but surely you understand why this is exciting news. How many times have you gone shopping in Second Life and have had live representatives eager to help you with your purchases? It doesn’t happen very often, but I’m fairly sure Earthstones has the occasional representative available.


I was curious whether this was just a fluke of timing, so I came back to the location later on, only to find a customer service representative eagerly helping more customers as they wandered through the store. Even eight hours later, a representative was at the store within minutes of my arrival.


Bax Coen Customer Service

Peace Edenflower (right) helping a customer at Bax Coen Boots.


One of the reasons Lindsay dislikes Stiletto Moody (I know, shocking that anyone could somehow dislike Stiletto Moody) is that, besides the obscene prices, the shoes don't actually look right on her feet. Yes, you can adjust the shoes and move them around, but that doesn't stop the invisible prims from making part of her ankles invisible regardless of where she places the shoes.


It may be an unfair assessment, but for nearly 3000L I don't expect a pair of shoes to have anything wrong with them. At the very least, you would think that for the money they charge they could hire some customer service reps to actually be at the store to help out.


Which brings us once again back to Bax Coen.


If you look at the picture at the beginning of this post, you'll notice a sign behind Bax asking whether the customer has Viewer 2.x. It seems she's offering an alpha layer for her boots to mask the feet. Great idea (and I'm sure other shoe makers are following suit).


If only Bax Coen made shoes as well, I would call the store a one-stop shop for all things footwear. She'd sure as hell give Stiletto Moody a run for their money; in the area of boots, I'd say she already has Stiletto Moody beat. Bax Coen may not have the endless aisles of shoes and boots that Stiletto Moody has, but what she does have is high quality and reasonably priced, with outstanding customer service to boot (pun intended).


That’s why I raised my personal rating for this location from 3 stars to 5 overall. The only thing that’s lacking at this location is more selection, but barring that, I couldn’t recommend Bax Coen enough for your holiday shopping this year.


Every woman in Second Life should have at least one pair of Bax Coen boots in her inventory, and unlike Stiletto Moody, you can afford to have a full collection. Everybody loves great prices and helpful staff :)


Selection: ★★★★

Could use a bit more selection, maybe some shoes. But what she offers is excellent.


Service: ★★★★★

Prompt, courteous, and cheerful. Upon arriving at three different times, a representative was eager to help within minutes.


Quality: ★★★★★

It’s not Stiletto Moody, but the quality is impressive for the price range.


Price: ★★★★★

875L for a pair of quality women's boots. How could you go wrong?


Do you have any shopping locations you’d like to share? Drop a note in the comments! I’d love to check them out. I’m looking for high quality, affordable prices, and locations that aren’t part of the mainstream Destination Guide or press coverage.

Saturday, December 4, 2010

Steve Studies

I told a friend of mine that I would work on turning him into a video game character. I was having trouble getting motivated to be creative and do homework, so I started drawing just to get going, and realized I had forgotten to work on my friend's character design.

Here are just a few sketches from my sketchbook: some face studies, ear studies, mouth studies, etc.

The ultimate plan would be to concept an entire character and then model it in 3D.

Thursday, December 2, 2010

CIA grans

I love the random security words/letters you sometimes have to type in to post a comment. This one came up today. It made me think of CIA grandmas... showing photos of their many grandchildren. My Mum does that *sigh*

I hope CIA Grans never becomes a movie. I don't want credit for the idea if it does.

So, eight posts since the first monkeys and not a single hair of an ape. I wouldn't blame you if you stopped following right now; I would if I were you. The world is full of false promises and lies; you don't need to go to an art blog to have your dream shattered once more. Go on, I won't mind. I'm sure there are millions of other monkey blogs, probably even smarter ones.

Unless I promise that the next post... after this one will contain a monkey. Will you stay?

To anger you chimp fans even more, here are some arty photos of me at work at the rock that is black. I'm method acting in the second shot.

Wednesday, December 1, 2010

TikiRender

Quick modeling of the tiki statue with some basic lighting.

Loyola Marymount University in #SecondLife

When: Thursday, December 2nd @ 7:45PM EST


Where: Loyola Marymount University SLURL


Why: I’ll be attending as a guest speaker


Notice: The following post is quite long. If you’re not into reading today, then feel free to stop at the first question marker in the post and call it a day. Pictures for this post were taken at LMU Psychology Island in Second Life using Kirstens S20 (42), all shaders enabled except Global Illumination. (Depth of Field is also enabled)


LMU Psychology Auditorium - All Shaders 1


Sometimes the future is a scary thing to think about. Especially when we take into consideration that this Thursday I’ll be the guest speaker at Loyola Marymount University in Second Life. Just imagine a room full of college students from various disciplines, all eagerly awaiting their turn to ask me questions about various technology topics and virtual environments.


Yeah, I had to stop and think about that too. Is it really a good idea to let somebody like me play a part in shaping young minds? I thought hard about this when writing the book chapter as well, and the best answer I could muster was "as long as these students think for themselves". As an aside, the book seems to have finally been released (I noticed it available on Amazon recently).


It’s one thing to be an academic for a class at a university or giving lectures for business, but something always made me uneasy about the prospect of influencing future generations of young minds. Nobody really knows the future, and the best we can ever hope for is an educated guess. This is why I sincerely hope that the students attending on Thursday do not take all that I say as gospel and are willing to challenge and push further on their own.


Dr. Richard Gilbert (Professor of Psychology at LMU) is a really interesting guy to say the least. He’s the head of the P.R.O.S.E. Project at LMU (Psychological Research on Synthetic Environments) but even more interesting is that this man has a Grammy Award for co-writing a song in the movie Flashdance (1984). Naturally, this is the same guy behind the SLAnthem.com contest and I can only sit and wonder what sort of life this guy has led to bring him through such accomplishments.


This is the man who approached me about being a guest speaker for one of his classes, and I gladly accepted (as a bonus the college is offering an honorarium for the time).


The prospect of speaking to this class didn't seem too out of sorts when I accepted, but then I began to really think about it. As the professor of the class, Dr. Gilbert would naturally assign homework and some research to the class prior to Thursday, so that they could prepare questions and topics to discuss with me. That alone is what got me…


I'm still trying to wrap my head around the fact that a class full of students from various related subjects is busy, as I write this, doing homework assignments centered on my being the guest speaker on Thursday. I can imagine twenty or more students (maybe) sitting in their dorm rooms tonight, researching ideas and questions with me on their mind.


Maybe it’s a bit of empathy to be putting myself in their shoes?


This is, after all, college. So chances are that most of those students are probably drunk and partying right now. (laughs).


LMU Psychology Island with All Shaders


Dr. Gilbert has, thankfully, provided me with a list of topics expected to be covered over the course of the two-hour class. In addition, the class will break for a short recess about halfway through (at least that is what I was told). I don't know how well I'd hold up under two straight hours of students barraging me with questions and conversation, but to be honest I held up well at the Friday Night Talk Show (which went on for nearly four hours of audience conversation).


Let's take a moment to go over the topics presented to me for Thursday night's class:


1. Issues of Server architecture, so you can address the cascading structure you advocate.


This topic stems from my advocacy of a hybrid decentralized server structure to properly handle load and bandwidth through massive, parallel fabric computing. On the surface, it sounds a lot like I'm suggesting everything be done in the cloud, but it's a bit more than that.


With cloud computing (as we'd normally expect), there is still a centralized datacenter someplace. The only things that have really changed are the hardware and how it is utilized, insofar as the software is entirely executed server-side and streamed to the user via a client. A cascading architecture is an evolution of cloud computing, because it assumes that not only is the central datacenter involved, but each individual user in the system is also a repository and relay.


Each virtual-environment user already holds redundant information: the cache. This information can readily be passed along to others nearby in a virtual space without asking a central server for it. Compare that with the current model of telling a central server that you have moved, so the server can tell 50 other users near you that you've moved (which, to be honest, seems silly).


Could we not connect via a cascading architecture in virtual peer clusters, thus informing each other of actions which do not need authorization? Surely 100 simultaneous users in an area are capable of relaying this information to each other, not to mention sharing their redundant cache data as well.


Better yet, why do we construct simulation systems in a manner that requires brute force and a lot of bandwidth centrally? Surely by now we would have realized this will ultimately fail to scale.
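The peer-cluster idea above can be sketched as a toy relay: each client keeps a set of nearby peers and floods position updates to them directly, with the central server nowhere in the loop for actions that need no authorization. Everything here (the class names, the flooding scheme, deduplicating by update ID) is my own illustration of the concept, not any actual Second Life protocol:

```python
import uuid

class Peer:
    """A toy virtual-world client that relays updates to its cluster-mates."""
    def __init__(self, name):
        self.name = name
        self.cluster = set()   # nearby peers we exchange updates with
        self.seen = set()      # update IDs already processed (stops relay loops)
        self.positions = {}    # last known position of each peer

    def join(self, other):
        self.cluster.add(other)
        other.cluster.add(self)

    def broadcast_move(self, position):
        """Announce our own move; no central server involved."""
        update = (uuid.uuid4().hex, self.name, position)
        self.receive(update)

    def receive(self, update):
        uid, who, position = update
        if uid in self.seen:
            return             # already relayed this one, drop it
        self.seen.add(uid)
        self.positions[who] = position
        for peer in self.cluster:   # cascade the update to everyone nearby
            peer.receive(update)

# Three avatars standing near each other form a cluster.
a, b, c = Peer("a"), Peer("b"), Peer("c")
a.join(b); b.join(c)   # note: a and c are not directly linked
a.broadcast_move((10, 20, 30))
print(c.positions["a"])   # (10, 20, 30), relayed via b
```

The point of the sketch is the last line: c learns of a's move even though they never connected directly, and the "server" never had to fan the update out itself.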


2. A status report on the quest for a universal format for 3D, ala HTML and JavaScript for 2D


As part of the IEEE Virtual Worlds Standards Group, I can say that the closest thing to a universal 3D format that has been agreed on is the Collada file. Past that, I have yet to see anything else solidly proposed.


My younger brother (who is in college) once proposed that files like Collada could be reduced algorithmically to a single number and decimal, to be decoded and expanded back to full resolution by reversing the algorithm. While this would require brute-force computation on the user's part to utilize those files, it offers an interesting glimpse into procedural methodologies for indefinite fidelity.


We see such things today with Allegorithmic.com and their procedural textures system: 2MB of texture data in 2KB. I've also seen some interesting things from DirectX 11 and tessellation algorithms that increase the fidelity of 3D models beyond what the model data actually stores. I really think these procedural methods for fidelity are the future.
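To make the procedural principle concrete, here is a toy generator in which a handful of parameters stands in for megabytes of stored data: any region of an effectively unlimited "texture" can be recomputed on demand, at the cost of CPU work on the user's side. This is my own minimal sketch of the idea, not Allegorithmic's or anyone else's actual algorithm:

```python
import hashlib

# The entire "texture" asset is these few parameters: a handful of bytes
# that can regenerate any region of the image, at any size, on demand.
PARAMS = {"seed": 42, "scale": 8}

def texel(x, y, params=PARAMS):
    """Deterministically derive a grayscale value (0-255) for any texel.
    The same inputs always yield the same output, so nothing is stored."""
    key = f"{params['seed']}:{x // params['scale']}:{y // params['scale']}"
    digest = hashlib.sha256(key.encode()).digest()
    return digest[0]  # first byte of the hash as the shade

# Expand only the 4x4 patch we actually need, never the whole texture.
patch = [[texel(x, y) for x in range(4)] for y in range(4)]
print(len(patch), len(patch[0]))
```

A real procedural system layers smoothed noise, gradients, and filters instead of raw hashes, but the trade is the same one my brother described: tiny data, brute-force computation to expand it.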


3. Graphical Developments and progress toward photorealism


Which, of course, leads us into this topic. As I said, I truly believe that procedural methods will win out in the end. Interestingly enough, shortly after I mentioned the push for photorealism in Second Life, I saw a video posted by @oobscure on Twitter about the Depth of Field viewer in the development channel. It looks really nice in the video, and first-hand it's just as stunning when combined with the shadows, lighting, SSAO, and mesh abilities, as seen in the snapshots in this post from Kirstens S20 (42).


However, there is still quite a lot of progress to be made in the photorealism department. Higher realism obviously requires more computational power, and that doesn't change. I believe, though, that we are quickly reaching a point where the older methodologies behind our graphics capabilities must be approached more intelligently.


Static images aren't as good for fidelity as procedural dynamic textures, and what we think of as high definition today, 1024x1024 or 2048x2048 resolution in a PNG or TGA, is comparatively very low resolution. I understand that graphics cards aren't really meant to handle 10,000x10,000 textures, but who says they actually have to handle the entire image all at once?


.debris Procedural Demo | 177kb Executable Size


That is why I believe procedural methods will win in the end. You can essentially have a seamless 32,000x32,000 texture in 10KB, but the algorithm involved knows to show you only the highest resolution you can comfortably see on your graphics card, and only the area you can actually see (as in, not trying to load the entire grass texture for the State of California all at once).


It’s all about intelligently streaming only the information we need at any given moment.
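That "only what you need" logic can be sketched in a few lines: pick the coarsest resolution level that still matches the surface's on-screen size, and generate or stream only the tiles under the viewport. The numbers, tile size, and function names here are illustrative assumptions of mine, not any real engine's API:

```python
TEXTURE_SIZE = 32_000   # full resolution of the procedural texture (one side)
TILE_SIZE = 256         # we only ever decode/generate 256x256 tiles

def mip_level(texture_size, pixels_on_screen):
    """Halve the resolution until going any lower would drop below
    what the viewer's screen can actually show; return (level, size)."""
    level, size = 0, texture_size
    while size // 2 >= pixels_on_screen:
        size //= 2
        level += 1
    return level, size

def visible_tiles(view_x, view_y, view_w, view_h):
    """Return only the tile coordinates under the current viewport."""
    x0, y0 = view_x // TILE_SIZE, view_y // TILE_SIZE
    x1 = (view_x + view_w) // TILE_SIZE
    y1 = (view_y + view_h) // TILE_SIZE
    return [(tx, ty) for ty in range(y0, y1 + 1) for tx in range(x0, x1 + 1)]

# A surface covering ~500 screen pixels never needs the 32k original:
level, effective = mip_level(TEXTURE_SIZE, 500)
print(level, effective)   # 6 500

# A 1024x768 viewport touches only 20 of the texture's thousands of tiles:
print(len(visible_tiles(0, 0, 1024, 768)))   # 20
```

Real renderers do this per-frame with GPU mipmapping and frustum tests, but the budget arithmetic is the same: resolution bounded by the screen, coverage bounded by the view.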


4. Developments in shared media and integrating 2D applications into immersive settings


I’ll be leaving this one to answer in class, but you can assume I’ll talk a bit about the developments with the koios media center in Second Life.


5. Current Comparisons between SL and other platforms


I’d say Second Life is a median system. Kind of like choosing Mario in SMB2 when you have Luigi, Princess and Toad at your disposal. Graphically it’s starting to catch up to things like BlueMars without having to go into graphical overkill. It’s fairly powerful as a platform, open enough to do many things, and it has average strengths. For the time being, Second Life is the all around solution I’d recommend for virtual environments.


However, this doesn't mean I'd split hairs and differentiate between Second Life and OpenSim, InWorldz, SpotOn3D, etc. They're all essentially based on the same underlying technology, despite the bells, whistles and pinstripes painted on the sides.


Other technologies I’ll cover in class.


6. Your projections for the near and moderate term future for SL and the wider field of 3D worlds.


Concerning Second Life, I’d say I believe that the open source community will probably make many more strides to push the technology forward than Linden Lab will. That’s not necessarily a bad thing, as I really do like Open Source software and crowd sourcing. As for the wider field of 3D Worlds… I’ll cover that in class.


I will say, however, that I don’t believe that virtual worlds on their own have a future. Like any good technology, it matures and becomes ubiquitous. What the future is, concerning virtual worlds, is not virtual worlds in the sense that we know of them today.

OnLive just works, even for those darn Europeans!

I think this deserves its own post. Someone (anonymous) told me that the OnLive service can be accessed and played from EU countries as well, so I gave it a try and downloaded and installed the tiny OnLive plug-in. To my surprise, I actually got it working on a pretty old PC (just a Pentium 4 at 3GHz). I was flabbergasted. I'm about 6000 miles away from the OnLive servers and it's still running! The quality of the video stream was more than decent, and smoother than when I try to decode 720p YouTube videos, which my system just cannot handle.

My first impression: I love it. Now I'm absolutely positive that this is the very near future for video games. It's a joy to watch others play and to start playing the same game within seconds! I've tried some Borderlands, Splinter Cell Conviction and FEAR 2. There is some lag, because I'm about 6000 miles from the OnLive server (I got a warning during log-in that my connection has huge latency), but I could nevertheless still enjoy the games. About half a second (or less) passes between hitting the shoot key and seeing the gun actually fire, and the same when moving your character. I must say, though, that I got used to the delay after a while and learned to anticipate my moves by half a second. My brain noticed the delay during the first minutes of play, but I soon forgot about it and just enjoyed the game. If I can enjoy an OnLive game from 6000 miles away, then US players, who live much closer to the OnLive servers, have got to have an awesome experience. The lag could also be due to my own ancient PC (which is not even dual core) or to the local network infrastructure here in Belgium, even though I have a pretty high-bandwidth connection. I can't wait until they deploy their EU servers. Image quality is quite variable; I guess that's partly because of my PC, which cannot decode the video stream fast enough. FEAR 2 looked very sharp, though. The image looks best when you're not moving the camera and just stare at the scene. The recently announced MicroConsole seems to offer very good image quality, from what I've read.

I think that cloud gaming will give an enormous boost to the graphics side of games, and that photorealistic games will be here much sooner thanks to cloud rendering and its inherent rendering efficiency (especially when using ray tracing; see the interview with Jules Urbach). My biggest gripe with consoles like the Xbox and PlayStation is that they stall graphics development for the duration of the console cycle (around 5 years), especially the latest round of consoles. With the exception of Crysis, PC games don't make full use of the latest GPUs, which are much more powerful than the consoles. I ran 3DMark05 a few days ago, and it struck me that this 5-year-old benchmark still looks superior to any console game on the market. I truly hope that cloud gaming will get rid of fixed console hardware and free up game developers (and graphics engineers in particular) to go nuts, because I'm sick of seeing yet another Unreal Engine 3 powered game.

I also think that OnLive will not be the only player and that there will be fierce competition between several cloud gaming services, each with their own exclusive games. I can imagine a future with multiple cloud gaming providers such as OnLive, OTOY, Gaikai, PlayStation Little Big Cloud, Activision Cloud of Duty, EA Battlefield Cloud of Honor, UbiCloud, MS Red Ringing Cloud of Death (offering Halo: RROD exclusively), Valve Strrream... To succeed they would have to be accessible for free (just like OnLive is now), without monthly subscription fees.

All in all, it's an awesome experience that is going to open up gaming for the masses and give a new meaning to the term "video game". The incredible ease of use (easier than downloading a song from the iTunes Store) will attract vast audiences, and for this reason I think it's going to be much bigger than the Wii and will completely shake up the next-gen console landscape (Wii 2, PS4 and Xbox 2.5/720/RROD2/...). MS, Sony and Nintendo had better think twice before releasing a brand new console.

Be it OnLive, OTOY, Gaikai or any other service, I, for one, welcome our new cloud gaming overlords!