Sunday, August 31, 2008

We Have Normality... but what is normal anyway?

The auto-redirect seems to have been fixed, thanks to the hosting administrators. Some upgrades were going on with the servers and a few things got slightly messed up in the process. In one instance, an entire server hard drive took a nose dive (luckily it wasn't ours).

So let's recap for a moment:

Site was redirecting to a GreenCard Application site [supposedly the redirect flag on the server was turned on for everything instead of just the 404 pages]

Checked the site this evening and everything seems to be in order again.

If you notice anything out of the ordinary with the site again, feel free to give us a heads up so we can try to find out what is going on. We would again like to personally thank Dr. Duke and Keith Thomas for giving us the heads up about the redirect issue.

And now we resume our regularly scheduled programming :)

Darian "Is It Just Me, or Do Pokemon Seem Completely Non-Plausible?" Knight

Saturday, August 30, 2008

US Greencard Lottery?

Thanks to a handful of our website visitors we've been informed of an odd redirect issue on our site. Visitors are inexplicably being redirected to a US Greencard Lottery site.

We've been looking into the issue and talking with the hosting admins to figure out what the issue is and how best to remedy it.

In the meantime, for those who can see the website as it normally is, please forward this message to the rest of the beta group and let them know a fix is in progress.

Our sincerest apologies for the downtime.

Quick Explanation:

Apparently there was a hardware failure on Box15 of the host, and while Andromeda is hosted on Box12, our suspicion is that during the restoration of Box15, other servers in the vicinity were also included in the restoration process just to be on the safe side.

The site will be temporarily down until things are restored, but we are getting reports that some visitors are now returning to original functionality and able to view the site uninterrupted.


Wednesday, August 27, 2008

BCA Blessings


We received our latest Prayer Notes and The Real Australian today - the regular publications of Bush Church Aid. We are now part of the BCA family.
Fantastic! What a blessing.

Tuesday, August 26, 2008

Number 1: The KLF

I've been a bit negative recently, so this is number 1 in an occasional series of things that make me happy.

America: What Time Is Love?

Playing guitars, in armour, in a storm, on a viking longboat, the Justified Ancients of Mu Mu remind us what time love is:



Justified and Ancient

They're justified, ancient, drive an ice cream van (make mine a 99 please), and persuaded Tammy Wynette to get involved in their silliness. I like to sing this one at my local karaoke haunt.



Doctorin' the Tardis

An early appearance under the name "The Timelords". This was my first exposure to Drummond and Cauty. An odd blend of The Glitter Band and the old Who theme.



More info on the KLF here

Friday, August 22, 2008

Formation for Transformation - Money, Sex, Power

It's been a policy of ours for some time now to include within our annual preaching program at least one teaching time on money and stewardship and one "topical" series. Our other teaching series are "expositional" in that they work through books of the Bible (or at least biblical themes) and expose God's Word to us and us to God's Word.

During this year, as part of our focus on taking responsibility for our response to God, on seeking to grow spiritually, and therefore to be formed to be part of God's transformational intent for the world - we have needed to go back to basics. And so we have looked not just at money - but at the three key human drives of Money, Sex, and Power. We've stolen the title from a book by Richard Foster which I've reviewed on my other blog.

It's in these three areas - money, sex, and power - that we run into and expose our spiritual immaturities, our weaknesses, and propensities to sin - our "flesh" or "sinful nature" hanging on at the gut level of who we are. My attitude towards money and wealth, the way I exercise my sexuality, and the level and manner of control and power I exercise in my life are good places to help determine spiritual maturity.

During this sermon series we are spending two weeks on each topic - the first topic has been power, leadership, authority and control; the second money, wealth, and stewardship; the third sex and relationships. For each topic the first week has been spent giving a time of teaching and bringing to light some of what God tells us through Scripture about each topic. We have then invited questions via email, SMS or Facebook so that the subsequent week is presented as answers to the questions that people are asking.

We have done power and money and are just about to do sex. And the questions have been thought-provoking, challenging, and relevant. Are you meant to seek authority and influence? If so, how? What do I do with the money God has given me? Am I meant to tithe 10%? Why, and for what purpose?

We have tried to be direct and as honest as possible. Some of the feedback to me has been that it has been "uncomfortable but in a good way." You can see that for others there is a dawning about some of the ways in which the 24-hours-a-day, 7-days-a-week nature of following Jesus confronts them and convicts them. Above all, by dealing with these issues we are facing the fact that this isn't a game, and church isn't just a nice place to go on a Sunday, but it's about offering our lives to God seriously and totally and provoking one another to righteousness.

In some ways this basic stuff is just milk - following Jesus with the basics. But it's not meant to be rocket science. It's just a matter of dying to self and growing for him. We've got more growing to do, through more dying to ourselves and living for him. Our constant prayer - don't let us get away God - deal with us and make us your own in everything we do.

Thursday, August 21, 2008

Prayer Points - August 2008

These prayer points were just sent to our prayer point mailing list. Please contact me if you would like to be added to that list:



Thank you for continuing to support us by praying for the Connections congregation and the work of the gospel here in the North-West of Tasmania. This email list started off in preparation for the visit by the Ridley Mission Team - and it's been almost a year since that happened. The work, and the prayer, continue. Here are some points for this month:

1) We have been, and are being blessed, by a number of milestones and joyous occasions in people's lives. We have three engagements in our midst at the moment, all to be married before the end of the year. Babies are being born, people are being baptised and confirmed. It's great to see God at work in that wonderful mix of the ordinary and extraordinary that makes up our lives. Please pray for those who will be at the centre of these occasions over the next few months.

2) Please pray for us as we go about organising some new programs that will help reach new people, and provide some of the foundational teaching and formation that people are thirsty for. Please pray for those preparing for baptism and confirmation as they go through the "Jesus All About Life" course. Please pray as we identify and train up facilitators for a Lifekeys "Search for Life" course and for those doing the organising. Please pray as we begin the early stages of having a clearly defined program for raising up and training leaders.

3) Please pray as we go about finalising arrangements for a Youth Worker and a Children's Worker and begin advertising. Please help us to "pray in" these harvesters.

4) Arrangements for using the local school as a Sunday venue are at the initial stages of being organised. Please pray for those participating in the negotiations and discussions and pray for favour with the various boards and committees that will need to give approval.

5) With a number of newcomers and other factors, we are learning the importance of each of us taking responsibility for helping build a cohesive and unified church community. Please pray for us to be sharing life with each other well.

6) Please continue to pray for a thirst for God's Word in us and around us - that God will be at work in the people of Somerset and the Burnie area and that the seed sown through us will find fertile soil and the Spirit already at work.

Also, please pray for myself and for my family as I attend the EFAC National Conference in Melbourne before heading with my family to the BCA conference in Launceston.

Points of Connection - Mark Driscoll in Australia

I was going to blog this in my other blog, which has a lot of stuff regarding my personal thoughts and cogitations about things. But after watching this interview of Mark Driscoll in Sydney (at sydneyanglicans.net) there were so many points of connection with Connections that it seemed best to put it here.


All of our leadership at Connections have read Mark Driscoll's Confessions of a Reformission Rev which describes the planting and growth of Mars Hill Church in Seattle, USA. I have found much wisdom, encouragement, and inspiration in Driscoll's attitude to church planting. I gel with his reformed theology, his healthy grasp of the Spirit at work, and his method of cultural engagement. There is much from Mars Hill that can and has shaped Connections. Personally, I know that I have become more direct, perhaps even bolder, in my preaching. The sheer importance of getting to know Jesus, worshipping him and serving him 24/7 becomes our point, purpose, and joy.

And so when the interviewer in this video talks about Sydney Anglicans being intrigued by his reformed theology combined with his ability to connect culturally, I smile and I cheer.

And some of the questions from readers of the sydneyanglicans.net website that are raised in the video are interesting. Again, there are echoes of the small-scale issues and principles that we face at Connections. Paraphrasing, examples are:
Q) How do you prevent issues of being a personality cult?
A) The leader needs to give up power - not too quickly, but slowly to qualified people. It takes the stress off of me...

Q) Isn't secular music worldly?
A) It's using culture in a redemptive way.... Our goal is to redeem as much as we can...

Q) You do communion every week - see it as commanded by Jesus - how do you stop it slipping into repetitive formula?
A) It follows preaching - think, pray, work it through - and you come when you're ready, not row-by-row. The key is to give people time and put it after the sermon so that it's in response to the Bible.

Q) How do you balance the cultural application when at times it can be seen as watering down the gospel?
A) The gospel will always win - we are to be faithful and fruitful, but if you have to pick one - faithful. I'm going to do everything I can to reach as many as I can - we see Paul say that.... The key is to never forsake the gospel.

Q) Dreams, vision, prophecies - do you still experience this? How do you interpret them as God-given or something else?
A) Most of the time when it's a prophetic dream, God gives me Scripture and then wakes me up. And usually it's something in the future which happens as he says. Other times I see things, the gift of discernment... God telling me something I need to know so that I can serve people.

Q) What are the main things you can impart to us (Australians)?
A) I find Sydney to be one of the most selfish cities I've seen in my whole life - everything is about personal happiness. [With reference to Connect '09, a big gospel initiative for Sydney next year:] That's all going to be relationship - reaching young men in cities is key to everything. The only way to reach them is through relationships... Strategic friendships with young men in cities.

Sunday, August 17, 2008

Second Logoff

I hate to be the bearer of bad news, but if you haven't figured it out already then we're about to tell you.

Second Life is broken.

I'm not entirely certain at what point this dawned on me personally, but when it did I immediately wondered why so many people tolerate this sort of thing. In any other industry, any software package that has a high failure rate is simply not released, and if it is released to the public then those responsible end up being fired.

It is my understanding that Second Life has been in development for about 10 years now (1999 - present), and yet after that much time in development and fixing, the system is still highly prone to complete blackouts and rolling restarts. When you find it a crapshoot to log in, that should be your first cue that something is amiss in this Metaverse.

Missing inventory items? Everyone waits patiently for Linden Labs to correct the problem. Issues with your inventory disappearing? Wait contentedly while Linden Labs fixes it, and so on.

The point here is, Second Life is poorly designed as a framework, especially one that is being open sourced. It has exactly zero capacity to scale upward and suffers from computer-killing traffic. How often would you tolerate anything else in your life breaking as much as Second Life does?
As an American, I can tell you the answer should be never. Yet every day sims proclaim how great the Lindens are for dealing with these corporate entities while completely sidestepping the issues at hand and blindly forgiving the massive faults.

If you thought I was bad for tearing ActiveWorlds Inc apart publicly, then trust me: compared to Second Life, Active Worlds is the respected elder. Second Life, bluntly stated, has holes in it the size of the Grand Canyon, ones that avatars walk through every day just to log in. And yet people say "How nice of them to fix this issue so quickly."

Here is a list of things that should not, under any circumstances, be an outstanding issue when you've been in public use for nearly ten years:

1. Logging in. Seriously... there's no reason the central database should be down so often that most if not all citizens have difficulty getting into the world.

2. Inventory items should not simply "disappear". This is a no-brainer, and yet again it's linked to SL's serious design flaws - namely database failures and hard drive failures. On a weekly basis? I mean, are you kidding me?

3. Yes, I understand that you can do lots of great stuff with LSL (Scripting), but did you really not feel the need to make a building interface that is actually conducive to building? Why does an amateur scripter have to create a Particle Generator HUD when Linden Labs has over 100 programmers on their staff?

4. What, exactly, is Havok 4 physics good for if you incorporate it into the system and 90% of the functionality is missing? Where are the cloth physics that are standard for physics engines? Flexi-Prims are a very poor showcase for Havok 4, considering it natively allows cloth physics, which would lead to being able to set an object as a material.

5. 10-second WAV files in PCM 44.1 KHz. Is Linden Labs even vaguely aware that there are thousands of audio and video formats in the world? And taking this a step farther, let's ask them another question (and this time the same rules apply): why even limit clips to 10 seconds? It seems like a horrible joke on the content-creating public.

6. Windlight can be summed up in about 6 files, at least in the capacity that Second Life makes use of them. These files are called shaders. These shaders, if implemented correctly, can create stunning visuals and make an environment more immersive. When implemented incorrectly, you end up with Second Life Windlight edition. This is the edition where, when you enable all shaders, your computer comes to a near standstill, if it doesn't crash entirely.

7. Again with Windlight, but this needed to be said - The clouds feature under "atmospheric shaders" is a fancy name for "Draw a 3D Perlin Noise animation on the sky dome in 2D, while using every ounce of your computer to do it". Maybe you can run that section just fine, and maybe you have a great video card as well - but the idea here is that such a simple inclusion should not bring a computer to its knees for all but the high end users.

8. 2 is company, 20 is a crowd, 200 is impossible. Let's talk a moment about scaling issues here. You and one other person in a sim run fine (as fine as SL runs for you); when 20 people show up, you're facing a very high probability that you and many others will crash. With 200 people in the same spot, the sim needs to be restarted. I don't think it matters if you have 50,000 people online simultaneously... what matters is that they are spread out over a ridiculously large map, and if they ever showed up at the same place, they would crash. This alone I find mind-boggling, considering that Second Life (and ultimately Linden Labs) is making an effort for rapid growth of users (so they say).

9. I'll be the first to say this: I miss the Active Worlds way of creating Zones. In Second Life, this is woefully underpowered in the form of Parcelling.

10. Prim limits and no way to permanently join prims in order to consolidate space. A large part of the lag in Second Life is due to the sheer number of individual objects that need to be loaded. It's no different from treating every object in Active Worlds as a singular object for building and then removing the option to use premade models in builds.

I have many more things I've mused about over the past few weeks, but I'll save everyone the hassle. Just be content that we are indeed learning a lot from participating in Second Life, and are writing it all on the whiteboard.

That is not to say that I will not use Second Life (at least until Andromeda is ready); I am just not personally impressed by the sheer lack of common sense in their development choices, otherwise known as the Second Life Viewer and Server.

In Other News

Queller is back! Woo. No need to withhold your posting and community stuff anymore, as the man himself has returned.

Blacking Out At The Keyboard From Lack Of Sleep -

William Burns
Project Leader
www.Andromeda3d.com


Saturday, August 16, 2008

The Three "Selfs" #1 - Self-financing

It is a long-held principle for guiding the planting of new congregations and churches that the ultimate aim is that the new body be characterised by the "three selfs" - self-governing, self-financing, self-reproducing.

In earlier stages of the Connections project we have put a lot of work into governance issues - particularly when it came to our relationship with our immediate context of the Parish of Burnie. This work will need to be tweaked over time.

Our vision includes elements of "self-reproduction" - which ultimately would include sending out a group to plant a church from amongst us somewhere else. But in the meantime we are committed to reproducing some of the blessings that God has given us and taking our place in contributing to the wider church through whatever opportunity arises.

I've called this article "The Three Selfs #1" because I hope that at some point I'll write about the self-governing and self-reproducing aspects.

However, in this entry I want to reflect a little on the self-financing aspect of this principle. The Parish recently had its Annual Meeting and for the first time a clearly demarcated budget for the Connections Process has been adopted and approved. We've always had financial self-sufficiency on our list of aspirations - but now we need to actually have a plan to achieve it.

Underneath the self-financing principle lies a necessary vision for growth. We have to plan for growth and, therefore, to be basing certain decisions and projections on things that don't yet exist!

The way it works for us is this. At the moment the Connections project is subsidised to the tune of what it costs to provide one paid clergyperson. Half of this money is provided through the Diocese by Bush Church Aid (BCA) and the other half comes from the contribution of the Parish of Burnie. The rest of the budget (which in the coming year needs to incorporate amounts to pay for youth ministry and to support the ongoing children's ministry) comes from general giving at Connections - either in donations, regular electronic transfers, or "in the plate." Our giving is reasonable - about $18 per adult per week (although it could be better) - and covers these expenses and balances the budget.

But it is unhealthy for us to depend on the subsidy from the Parish and from BCA. To be self-sustaining, and ultimately a net-giver to the Kingdom, we need to budget for a phase-out of this subsidy. We are projecting a phase-out of around 25% of the original subsidy per year over the next few years. So the subsidy will go down, and costs will necessarily go up as more ministry is done.

And so we plan, believe, pray for growth. We are planning for an increase in the number of adults. We have to act on that plan by removing the obstacles to that growth - things like the size of our current venue, and our lack of efficient structures for incorporating newcomers. We can see what we need to do.

The growth needed in the next twelve months is for about another 5-10 members or families. This is the grace of a target that looks reachable but is only available as a gift from God. Pray that we will get there - for the faith that's needed for subsequent years will be greater still.

Some are uneasy talking about money matters and church. I count myself among them often. But in the end we aren't talking about riches here, we're talking about corporate maturity and growth as a gospel-organisation. We wish to be blessed to be a blessing, to give and to contribute. And so we must take the risky, difficult, scary path of growth.

Why I hate weddings

I had a long post planned, but it boils down to this:

A reading was given at the wedding I attended today about how the Bible sees marriage. Apparently the main thing it has to say (according to this reading) is that if there is an argument between husband and wife, and they cannot reach an agreement, then the wife must submit to her husband.

As an Equality and Diversity Adviser I was appalled. But it was my friend's wedding, and I had to bite my tongue and seethe.

Aside: As a grumpy singleton who keeps getting wedding invites, I'm considering buying a ruined wedding dress and some toy spiders so I can attend as Miss Havisham.

Monday, August 4, 2008

Bitchy rant

Well it turns out my ex isn't a useless cow, she's actually a lying bitch.

She was the one wanting to get back with me, but her track record of unreliability has been a major barrier.

I decided to give her a second chance and invited her to join me at a friend's wedding. She was very excited.

And then, mere days before, she cancelled saying she had to work at short notice.

For most people this would be unfortunate but believable, but given her history, I was less than charitable and ignored her texts.

Sounds harsh maybe, but saying nothing was better than saying what was on my mind...


And then this morning I check Facebook, and see pictures of her out drinking with her mates on the night she was apparently working.

And people wonder why I am always so bitter and cynical...

Saturday, August 2, 2008

Voxel ray tracing vs polygon ray tracing

Carmack's thoughts about ray tracing:


I think that ray tracing in the classical sense, of analytically intersecting rays with conventionally defined geometry, whether they be triangle meshes or higher order primitives, I’m not really bullish on that taking over for primary rendering tasks which is essentially what Intel is pushing. But, I do think that there is a very strong possibility as we move towards next generation technologies for a ray tracing architecture that uses a specific data structure, rather than just taking triangles like everybody uses and tracing rays against them and being really, really expensive. It involves ray tracing into a sparse voxel octree which is essentially a geometric evolution of the mega-texture technologies that we’re doing today for uniquely texturing entire worlds. It’s clear that what we want to do in the following generation is have unique geometry down to the equivalent of the texel across everything.

There are some interesting things to note in there:

- ray tracing in the classical sense, in which rays intersect with triangles, is far too expensive for use in games, even with next generation hardware

- the sparse voxel octree format permits unique geometry


Octrees can be used to accelerate ray tracing and store geometry in a compressed format at the same time.
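
To make that dual role concrete, here is a minimal sketch of what a sparse voxel octree node could look like, written in C++ (my illustration, not id's or Otoy's actual layout; the field sizes are assumptions). A child mask plus a first-child index gives both the spatial subdivision needed for ray traversal and the compression: empty regions simply have no nodes.

#include <cstdint>
#include <vector>

// Sketch of a compact sparse voxel octree node (illustrative layout only).
struct SvoNode {
    uint8_t  childMask;   // bit i set => child i exists; empty space costs nothing
    uint32_t firstChild;  // index of this node's first child in the node pool
    uint8_t  rgb[3];      // averaged colour of everything below this node
    int8_t   normal[3];   // quantised averaged normal
};

struct Svo {
    std::vector<SvoNode> nodes;  // node 0 is the root

    // Number of existing children before child slot i, i.e. the offset
    // of child i inside the packed child array.
    static int childOffset(const SvoNode& n, int i) {
        uint8_t before = n.childMask & ((1u << i) - 1u);
        int count = 0;
        while (before) { count += before & 1u; before >>= 1u; }
        return count;
    }

    bool hasChild(uint32_t node, int i) const {
        return (nodes[node].childMask >> i) & 1u;
    }

    uint32_t child(uint32_t node, int i) const {
        return nodes[node].firstChild + childOffset(nodes[node], i);
    }
};

int main() {
    Svo svo;
    svo.nodes.push_back({0b00000101u, 1, {90, 90, 90}, {0, 0, 127}});  // root: children 0 and 2
    svo.nodes.push_back({0, 0, {255, 0, 0}, {0, 0, 127}});             // child slot 0
    svo.nodes.push_back({0, 0, {0, 255, 0}, {0, 0, 127}});             // child slot 2
    // Child slot 2 is the second existing child, so it lives at index 1 + 1 = 2.
    return svo.child(0, 2) == 2 ? 0 : 1;
}

Because an averaged colour and normal live at every level, the same structure also gives the multiresolution behaviour discussed further down: an interior node is a ready-made, filtered stand-in for its whole subtree.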

Quote from a game developer (Rare) on the voxel octree:


Storing data in an octree is far more efficient than storing it using textures and polygons (it's basically free compression for both geometry and texture data). It's primarily cool because you stop traversing when the size of the pixel is larger than the projected cell, so you don't even need to have all your data in memory, but can stream it in on demand. This means that the amount of data truly is unlimited, or at least the limits are with the artists producing it. You only need a fixed amount of voxels loaded to view a scene, and that doesn't change regardless of how big the scene is. The number of voxels required is proportional to the number of pixels on the screen. This is true regardless of how much data you're rendering! This is not true for rasterization unless you have some magical per-pixel visibility and LOD scheme to cut down the number of pixels and vertices to process, which is impossible to achieve in practice. Plus ray casting automatically gives you exact information on what geometry needs to be loaded in from disk, so it's a "perfect" streaming system, whereas with rasterization it would be very difficult to incrementally load a scene depending on what's visible (because you need to load the scene before you know what's visible!)
If you want to model micrometer detail, go ahead, it won't be loaded into memory until someone zooms in close enough to see it. Voxels that are not intersected can be thrown out of memory. Of course you would keep some sort of cache and throw things out on a least recently used basis, but since it's hierarchical you can just load in new levels in the hierarchy only when you hit them.
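
That stopping rule is simple enough to sketch (my paraphrase in C++ with assumed names and values, not Rare's code): stop refining a cell once its projected size drops below one pixel, which is exactly why the working set scales with screen resolution rather than scene size.

#include <cstdio>

// Angle subtended by one pixel, given a vertical FOV (radians) and resolution.
float pixelAngle(float fovRadians, int screenHeight) {
    return fovRadians / screenHeight;
}

// True if a cell of size cellSize at distance dist from the eye projects to
// less than one pixel, so the octree should not be refined any further here.
bool stopDescent(float cellSize, float dist, float pixAngle) {
    return cellSize / dist < pixAngle;  // small-angle approximation
}

int main() {
    float pa = pixelAngle(1.0f /* ~57 degrees */, 1080);
    std::printf("1m cell at 10m:   refine = %d\n", !stopDescent(1.0f, 10.0f, pa));    // 1
    std::printf("1m cell at 5000m: refine = %d\n", !stopDescent(1.0f, 5000.0f, pa));  // 0
}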


Voxels have some very interesting benefits compared to polygons:

- It's a volumetric representation, so you can model very fine details and bumps without the need for bump mapping. Particle effects like smoke, fire and foam can be efficiently rendered without using hacks. Voxels are also being used by some big Hollywood special effects studios to render hair, fur and grass.

- id wants to use voxels to render everything static with real geometry without using normal maps.

- Voxels can store a color and a normal. For the renderer, textures and geometry are essentially the same.

- The position of the voxel is defined implicitly by the structure that is holding it (the octree). Here's the good part: this structure represents both the primitives that need to be intersected and the spatial division of these primitives. So, in contrast to triangle ray tracing, which needs a separate spatial division structure (kd-tree, BVH, ...), voxels are structured right away in a grid or an octree (this does not mean that other structures can't be used as well). So for voxel ray tracing, octrees are perfect.

- Voxels are very cheap primitives to intersect, much cheaper than triangles (see the slab-test sketch after this list). This is probably their biggest benefit when choosing between voxel and polygon ray tracing.

- A voxel octree permits a very natural multiresolution. There's no need to go deeper into the octree when the size of a pixel is larger than the underlying cell, so you don't have to display detail if it's not necessary and you don't stream in data that isn't visible anyway.

- Voxels are extremely well suited for local effects (voxel ray casting). In contrast to triangle rasterization, there are no problems with transparency, refraction, ... There are also major benefits artwise: because voxels are volumetric, you can achieve effects like erosion, aging materials, wear and tear by simply changing the iso value.

- Ray casting voxels is much less sensitive to scene complexity than ray casting triangles.

(partly translated from http://forum.canardplus.com/showpost.php?p=1257790&postcount=96)
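
To back up the "cheap primitive" point above with something concrete: intersecting a ray with an axis-aligned voxel cell is just the textbook slab test below (a generic C++ version, not any particular engine's code) - a handful of subtractions, multiplies and min/max operations, with none of the edge or barycentric math a ray/triangle test needs.

#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Slab test: returns true and the entry distance tNear if the ray
// orig + t * dir (dir supplied as 1/dir) hits the box [bmin, bmax].
bool rayBox(const Vec3& orig, const Vec3& invDir,
            const Vec3& bmin, const Vec3& bmax, float& tNear) {
    float t0 = (bmin.x - orig.x) * invDir.x, t1 = (bmax.x - orig.x) * invDir.x;
    float tmin = std::min(t0, t1), tmax = std::max(t0, t1);
    t0 = (bmin.y - orig.y) * invDir.y; t1 = (bmax.y - orig.y) * invDir.y;
    tmin = std::max(tmin, std::min(t0, t1));
    tmax = std::min(tmax, std::max(t0, t1));
    t0 = (bmin.z - orig.z) * invDir.z; t1 = (bmax.z - orig.z) * invDir.z;
    tmin = std::max(tmin, std::min(t0, t1));
    tmax = std::min(tmax, std::max(t0, t1));
    tNear = tmin;
    return tmax >= std::max(tmin, 0.0f);
}

int main() {
    Vec3 o{-5.0f, -5.0f, -5.0f};
    Vec3 d{0.577f, 0.577f, 0.577f};                   // normalised (1,1,1)
    Vec3 inv{1.0f / d.x, 1.0f / d.y, 1.0f / d.z};
    float t;
    if (rayBox(o, inv, {-1, -1, -1}, {1, 1, 1}, t))
        std::printf("hit at t = %f\n", t);            // ~6.93
}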


Disadvantages of voxel ray tracing vs polygon ray tracing:

- Memory. Voxel data sets are huge relative to polygon data. But this doesn't have to be a problem, since all data can be streamed in (see the cache sketch after this list). This does however create new challenges when the point of view changes rapidly and a lot of new data bricks have to be streamed in at once. Voxel sets have the benefit over polygons that voxel subsets can be loaded in, which permits some sort of progressive refinement. Other possible solutions are: using faster hard disks or solid state drives to accelerate the streaming, limiting depth traversal during fast camera movement, or masking the streaming with motion blur or depth of field postprocessing.

- Animation of voxels requires specialized tools

- Disadvantages of ray tracing in general: dynamic objects require the octree to be updated in realtime. However, there are solutions for dynamic objects which don't require updating the octree (such as building a deformation lattice around a dynamic object and bending the rays as they hit the lattice when you raycast into it). id Tech 6 plans to tackle the problem of having many dynamic objects with hybrid rendering.
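
For the streaming side (the first point in the list above, and the "least recently used" cache the Rare developer mentioned earlier), a minimal LRU brick cache could look like this in C++ (my illustration; the brick ids, the budget, and the printfs standing in for real disk streaming are all made up):

#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

class BrickCache {
    size_t capacity_;
    std::list<uint64_t> lru_;  // front = most recently used brick id
    std::unordered_map<uint64_t, std::list<uint64_t>::iterator> pos_;

public:
    explicit BrickCache(size_t capacity) : capacity_(capacity) {}

    // Mark a brick as used by a ray, "streaming it in" if it is absent.
    void touch(uint64_t brickId) {
        auto it = pos_.find(brickId);
        if (it != pos_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // move to front
            return;
        }
        if (lru_.size() == capacity_) {                   // evict the LRU brick
            std::printf("evict brick %llu\n", (unsigned long long)lru_.back());
            pos_.erase(lru_.back());
            lru_.pop_back();
        }
        std::printf("stream in brick %llu\n", (unsigned long long)brickId);
        lru_.push_front(brickId);
        pos_[brickId] = lru_.begin();
    }
};

int main() {
    BrickCache cache(2);
    cache.touch(1); cache.touch(2); cache.touch(1);
    cache.touch(3);  // evicts brick 2, the least recently used
}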

More on dynamic raytracing:

Dynamic Acceleration Structures for Interactive Ray Tracing, Reinhard, E., Smits, B., and Hansen, C., in Proc. Eurographics Workshop on Rendering, pp. 299-306, June 2000. Summary: This system uses a grid data structure, allowing dynamic objects to be easily inserted or removed. The grid is tiled in space (i.e. it wraps around) to avoid problems with fixed boundaries. They also implement a hierarchical grid with data in both internal and leaf nodes; objects are inserted into the optimal level.
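
The "tiled in space" idea amounts to wrapping cell coordinates modulo the grid size, so a moving object never falls off a fixed boundary. Something like this (my reading of the summary above; the grid size is an assumed value):

#include <cstdio>

constexpr int GRID = 64;  // cells per axis (assumed for illustration)

// Map a world-space cell coordinate onto the wrapping grid; the extra
// + GRID keeps the result non-negative for negative coordinates.
int wrapCell(int worldCell) {
    return ((worldCell % GRID) + GRID) % GRID;
}

int main() {
    std::printf("%d %d %d\n", wrapCell(3), wrapCell(67), wrapCell(-1));
    // prints "3 3 63": cells 3 and 67 share a slot, and -1 wraps around
}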

Towards Rapid Reconstruction for Animated Ray Tracing, Lext and Akenine-Moller, Eurographics 2001. Summary: Each rigid dynamic object gets its own grid acceleration structure, and rays are transformed into this local coordinate system. Surprisingly, they show that this scheme is not a big win for simple scenes, because in simple scenes it is possible to completely rebuild the grid each frame using only about a quarter of the runtime. But, this would probably not be true for a k-d or BSP tree.

Distributed Interactive Ray Tracing of Dynamic Scenes, Wald, Benthin, and Slusallek, Proc. IEEE Symp. on Parallel and Large-Data Visualization and Graphics (PVG), 2003. Summary: This system uses ray transformation (into object coordinate system) for rigid movement, and BSP rebuild for unstructured movement. A top-level BSP tree is rebuilt every frame to hold bounding volumes for the moving objects. Performance is still an issue for unstructured movement.

Interactive Space Deformation with Hardware Assisted Rendering, IEEE Computer Graphics and Applications, Vol 17, no 6, 1997, pp. 66-77. Summary: Instead of deforming objects directly, this system deforms the space in which they reside (using 1-to-1 deformations). During raytracing, the rays are deformed into the object space instead of deforming the objects into the ray space. However, the resulting deformed rays are no longer straight, so they must be discretized into short line segments to perform the actual ray-object intersection tests.


Ray casting free-form deformed-volume objects, Haixin Chen, Jürgen Hesser, Reinhard Männer. Summary: A collection of techniques is developed in this paper for ray casting free-form deformed-volume objects with high quality and efficiency. The known inverse ray deformation approach is combined with free-form deformation to bend the rays to the opposite direction of the deformation, producing an image of the deformed volume without generating a really deformed intermediate volume. The local curvature is estimated and used for the adaptive selection of the length of polyline segments, which approximate the inversely deformed ray trajectories; thus longer polyline segments can be automatically selected in regions with small curvature, reducing deformation calculation without losing the spatial continuity of the simulated deformation. We developed an efficient method for the estimation of the local deformation function. The Jacobian of the local deformation function is used for adjustments of the opacity values and normal vectors computed from the original volume, guaranteeing that the deformed spatial structures are correctly rendered. The popular ray casting acceleration techniques, like early ray termination and space leaping, are incorporated into the deformation procedure, providing a speed-up factor of 2.34-6.56 compared to the non-optimized case.



More info on id Tech 6 and voxel ray casting in the ompf thread

Carmack, id Tech 6, hybrid rendering

In his QuakeCon keynote, John Carmack explained that his next generation engine id Tech 6 would still be mainly a hybrid renderer:


I can say with conviction at this point that the next generation games are still going to be predominantly polygon games. Even what we're looking at for id Tech 6 with all of this infinite geometry, voxelising everything, probably recursive automatic geometry generation - all of this is still going to be a hybrid approach. We hope that we can generate these incredible lush environments on there, but the characters are probably still going to be coming in as triangles over a skeleton there. There will probably be some interesting things tried with completely non-polygonal renderers, but the practical approach with games that look like the games we're doing now, but play better, probably will still have lots of polygons going on and these chips better be really good at that.



Full voxel based games with many dynamic objects and characters are still too demanding for next generation technology. On the other hand, a scene with few dynamic objects (such as the Ruby demo) should be entirely possible with voxels only.

Friday, August 1, 2008

Ruby, voxels and ray tracing

Intro

My name is Ray Tracey. I'm a graphics enthusiast with a passion for lifelike interactive graphics.

Recently, I was struck by a realtime demo of a photorealistic city scene that was shown at an AMD/ATI event. It looked like something that came straight out of my imagination. For years I have been wondering when interactive graphics would reach this level of quality. I was interested to say the least.



A week later, AMD revealed that this city scene was part of their new Ruby demo for the Radeon 4800 cards: they showed the same scene, but this time there was moving traffic and people. Ruby appeared, running for her life from a giant killer machine. Once again, I was in awe: it looked unbelievable, and I didn't expect to see such graphics within the next three years on consumer hardware. So after I witnessed this graphical marvel, I decided to find out more about it. After several hours of Googling, I had come to the disheartening conclusion that there was almost no info on the demo to be found, except for some lousy-quality YouTube videos and a small PR paragraph on the AMD site. But I searched further... The small snippets of info that I did find stimulated me to write this blog as a "resource" that bundles all the publicly available information on this technology and as a means for better understanding of the tech for anyone interested.

The Ruby demo

To begin with... a picture says more than a thousand words




This is a picture from the Ruby demo. It looks photorealistic, but is realtime and interactive.


A high quality video (720p) of the animation can be seen here

Low quality video here


The Ruby demo was made by a company named Otoy. Jules Urbach, founder and CEO of Otoy, has given some info on the technology behind the Ruby demo in several videos on the net:

Video 1: http://www.youtube.com/watch?v=ROAJMfeRGD4&feature=related

In this video, Urbach says that the Ruby demo is rendered with voxel ray tracing. Otoy can also dynamically relight the scene.

The full transcript of the video:

Rick Bergman: So Jules, this is his creation. He's done a fantastic job with it. You're probably also thinking, well this is a video. He's gonna step you through and actually talk you through some of the key features of this demo.


Jules Urbach: Thank you, Rick. What you're seeing here, is a frame from the animation you just saw, done Cinema 2.0 style. So the first thing you'll notice is that this isn't really just a video, we can look around, we can see the set that we've built, in fact it is, it's a set, you can see it's really ... When we first showed the clips of what we were doing with this, some people thought the street scene was film, and it's not, it's a completely computer generated scene, created by our art team. And you can see here, this is the relighting portion of the rendering pipeline, this is really just a very early teaser, a preview of what we're doing with this Ruby demo.


So you're seeing only the second half of the Cinema 2.0 rendering pipeline, the relighting portion of it. I can drag the cursor over any object and I can sort of see the different layers that go into making it whether it's global illumination, photon maps, diffuse lighting or in this case, complete control over the scene and the reflections.


And this is a really novel way of rendering graphics, we're not using any polygons. And the thing that makes this very different from just a simple relighting demo, is that every single pixel you're seeing in the scene has depth and it's essentially renderable as voxels.


We also have the capability of controlling every aspect of the exposure in the lighting pipeline, adding glares and glints to our satisfaction. And that makes a big impact in the rendering of any scene that we're doing.


So one of the things that is key to doing voxel based rendering is ray tracing, which I spoke about earlier, and the other element is compression, because these data sets are enormous. One of the things that's very exciting about the latest generation of hardware, coming from AMD, is that we can now write general purpose code, using CAL, that does wavelet compression. So we're able to compress these data sets, which are pretty massive, down to very reasonable components. And we think that we can stream those down and essentially give people, who have ever seen a video stream of that animation, essentially a fully navigable, relightable, completely interactive scene and that's the ... of 2.0 and we're very excited to be able to be part of that technology and that processing and bringing that to fruition.
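
Urbach doesn't say which wavelet Otoy's CAL code implements, but the core idea behind any wavelet compressor can be shown with one level of the 1D Haar transform (a generic C++ sketch, not Otoy's scheme): the averages keep a half-resolution copy of the data, while the details sit near zero wherever the data is smooth, and near-zero coefficients are what quantise and compress well.

#include <cstdio>
#include <vector>

// One forward Haar step: each pair (a, b) becomes an average and a detail.
void haarStep(std::vector<float>& v) {
    size_t half = v.size() / 2;
    std::vector<float> out(v.size());
    for (size_t i = 0; i < half; ++i) {
        out[i]        = (v[2 * i] + v[2 * i + 1]) * 0.5f;  // average
        out[half + i] = (v[2 * i] - v[2 * i + 1]) * 0.5f;  // detail
    }
    v = out;
}

int main() {
    std::vector<float> signal = {8, 8, 9, 9, 10, 10, 4, 4};  // smooth-ish data
    haarStep(signal);
    for (float f : signal) std::printf("%g ", f);
    std::printf("\n");  // averages 8 9 10 4, details 0 0 0 0 -> very compressible
}

A real codec would recurse on the averages and then quantise and entropy-code the detail bands; the point is just that smooth voxel data turns into mostly-zero coefficients.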

The first time I saw this video, the words "ray tracing" and "voxels" immediately grabbed my attention. So I did some further research...

The Ruby demo continued

One week after the unveiling of the Ruby teaser, Jules Urbach recorded a new video, in which he gave a bit more info on the voxel based rendering that was used in the Ruby demo and announced his other company, LightStage LLC. LightStage is a technology used by Hollywood film studios to capture lifelike 3D representations of actors, which can be relighted afterwards. It has been used in Spiderman 3 and will also be used to motion capture and digitize the actress playing Ruby. Her digital facsimile will then replace the Ruby character from the teaser.

Video: http://youtube.com/watch?v=Bz7AukqqaDQ

Full transcript:

Hi, I’m Jules Urbach and this is a follow-up to the presentation we did a week ago at Cinema 2.0, the launch for the 770. What we are showing is a couple of new things that we weren’t able to talk about last week, that I think are really interesting. So we’re announcing today that we are developing LightStage. LightStage LLC is a separate company for capturing and rendering really high quality 3d characters and LightStage is an interesting technology that was developed at USC by Paul Debevec and Tim Hawkins, and it solves the problem of the uncanny valley as far as characters go. So we’re very pleased we’re able to announce that we can actually take this data and start working with it and applying it in our projects.

So, if you take a look at what we were doing years ago, to do characters and animation it was limited by the fact that our artists can only create so much detail in a head or a human form. And this is the normal map for that head and this is a really complex skin shader that we wrote to try to recreate what humans look like. And in a lot of ways both these problems go away with the LightStage.

The LightStage first of all can capture a real person. So, to generate this head, an artist doesn’t even sit there and sculpt it. We can essentially put a woman in the LightStage, which is a domed capture environment, and it captures all the surfaces and all the normals for it and all the details, including the hair. And it captures it in full motion, so you have to understand that, unlike traditional motion capture where there is either make-up applied to the face or dots put on the face, we just put somebody in the LightStage and they can do their lines and speak and it does full motion capture optically.

So it gets the full data set of essentially all the points in their face. And this is in fact the rendered version of LightStage. This is all the data that is captured on the LightStage accumulated. It’s not a photograph. This is a fully relit head, based on the model you just saw. And it’s obviously a lot of data, but the work that we announced last week with the GPU, compression / decompression, we’re going to apply that to LightStage (and have) data sets that we can start loading in real-time and rendering them, not just for film work but in games as well.

So the LightStage data, I think it really closes the uncanny valley. I mean particularly this kind of data where we have all the relighting information, stored for every single pixel, it’s exciting and it gives us really high quality characters that I think look completely real. And that’s, I think one of the things we showed last week at the Cinema 2.0 event was that we could do scenes and cities and things that look really good and this essentially gives us people. And it goes even further than that, but we will certainly be announcing more as we get further along working on LightStage.

I’m gonna show one more thing that we didn’t get a chance to really show last week as well, which is some of the real-time stuff, the voxel rendering. We basically had two separate demos, one for just showing the fact that we can look around the voxelized scene and render it. And this one now, this demo, is a slight update of the original one, where I’m able to actually look around and essentially place voxels in the scene, but also relight it as well.

So this is part two of three that we’re releasing. This shows essentially us going through the scene, selecting an object and either rendering it through the full lighting pipeline or just doing the global illumination pass. Then we can also just use the normals to do full, totally new novel reflections on. But you can essentially see that the voxels, even in this fairly low resolution form, can capture lighting information and capture all these different details. And we can do that as we’re navigating through the scene.

So this is sort of part two of our Ruby real-time demo and part three of it is gonna show, as a next step, the full navigation through the voxel scenes. And that’s gonna be dependent on the compression we’re developing. Because right now, the reason why we are not loading the entire animation is that the frame data is about 700 megabytes for every frame. We can easily compress that down to 1/100th the size, and we're looking to do about 1/1000th the size. And then with that we’ll be able to load much larger voxel data sets and actually have you navigating pretty far throughout the scene and still keep the ray tracing and voxelization good enough that you don’t really see any sort of pixelized or voxelized data sets too closely.

So one more demo, that’s worth showing I think, is related to the LightStage. You can actually see that on the right here. And that basically is really a mesh that is generated from one person in LightStage. It doesn’t have all the LightStage data in there, but what you can see from this example is one reason why voxel rendering may be important. So this is really using a very simple polygonal mesh, even so it’s about 32 million triangles, just to render the scene. So I’m gonna show the wireframe of it and you can see that the data is so dense, everything from the eyes to the eyelashes is all there, and we’re only really using a small subset of the point cloud that’s generated from the LightStage. So if we move to voxel rendering, which I’m planning to do for LightStage as soon as we’re done with the Ruby demo, we’ll be able to have voxelized assets rendering in realtime at much higher resolutions than this. And that’s gonna be giving us characters that look better than anything we can show in any of these videos. And we should have that ready probably before the end of the year, so it’s exciting stuff! Thank you for watching, hope you enjoyed the demos.

Otoy, Transformers and Ray Tracing

After the Ruby/LightStage demo, four other videos appeared as part of an article about Otoy on TechCrunch. Urbach explains that he started experimenting with Renderman code on graphics hardware during the making of Cars in 2005. This work caught the interest of ILM, who gave Urbach the models from the Transformers movie to render in realtime. Urbach and his team made four commercials for the Transformers movie that were rendered and directed in realtime on graphics hardware. Afterwards, he was contacted by Sony to work on the Spiderman movie.



The four videos:


Video 1 OTOY Demo

This video shows short clips of realtime rendered Transformers sequences



Video 2 Jules Urbach explains OTOY's real-time graphics rendering

In this video, Urbach talks about his experiments with GPU ray tracing in 2005, the Transformers trailers and the voxel raytracing for the Ruby demo. For the tests with Cars in 2005, he was able to do "realtime raytraced reflections with up to 20 bounces of light". He also implemented a realtime global illumination technique. For the new Ruby demo, he is actually "raytracing the entire scene", and "not using the vertex pipeline anymore". Thanks to the voxel rendering "the level of detail becomes infinite".



Video 3 OTOY Graphics Rendered in the Browser

This video shows the server side rendering capabilities of Otoy. It shows Urbach interacting with scenes from the Transformers trailers that are being rendered in realtime on his GPU servers and streamed over the net into the browser.

Urbach mentions "raytraced reflections on the windows". When he switches to nighttime, he says "in this particular demo, there's no baked lighting, nothing is precomputed", there are "hundreds of lights in the building rendered in realtime".

The demo runs on three graphics cards (3x RV770): one card renders the ILM Optimus, second card renders the G1 Optimus Prime and the third card renders the city and the raytraced reflections on the windows.



Video 4 Jules Urbach of OTOY Explains LightStage

Video about LightStage, slightly more elaborate than this one.



There is also a video of the full AMD Cinema 2.0 event in which Urbach talks a bit about ray tracing on GPUs (from 41:00 to 47:00) and goes a bit more in-depth during the Q&A session (from 72:00 to 88:00):



- Urbach has been talking to game publishers to start integrating the relighting part of Otoy in existing game engines

- Otoy can do full raytracing, but also supports hybrid rendering. It can convert any polygonal mesh to voxels

- The Ruby demo does not use any polygons, only voxels

- For games, Urbach thinks hybrid rendering will be the way to go "for a very long time"

- With this technology, game developers will require a different way of working. Basically they're saying that you can make a photorealistic game, but the workload on the artist side will be astronomical

- In 2005, Urbach started out writing approximations to Renderman code during the making of Cars. At the time, he used cheats for ray tracing and reflections. In three years, GPUs have evolved so quickly that the latest hardware makes realtime ray tracing possible that is “99 % accurate”

- Voxel data sets are huge, but with voxel based rendering you can load only subsets of the voxel space, which is not possible with polygons. You can also choose which texture layers to load

- Compression and decompression of the voxel data is CPU bound. What takes 3 seconds to decompress on a CPU, can be done at a “thousand frames per second” on a GPU.

- What's interesting according to Urbach is that in 2005 he started out writing approximations to ray tracing, but the latest generation of hardware allows him to do ray tracing that gets really close to the 100% point





Urbach also showed another Otoy demo at the AMD event, called Bug Snuff. It shows a photorealistic scene with a scorpion, rendered in realtime and directed by David Fincher. Really impressive stuff!






Lastly, the ompf thread where it all started: http://ompf.org/forum/viewtopic.php?f=6&t=882

Thanks to all the ompf members and guests who participated and contributed to the thread.

id, Voxels and Ray Tracing

According to this article, the full Ruby demo will be shown to the public at Siggraph 2008.

My interest in this demo is, apart from the photorealistic quality, based on two facts: the GPU ray tracing and the voxel based rendering. Never before have I seen a raytraced (CPU or GPU) scene of this scope and quality in realtime. Urbach has stated in the videos that his raytracing algorithm is not 100% fully accurate, but nevertheless I think it looks absolutely amazing.
At Siggraph 2008, there will be a panel discussion on realtime ray tracing, where Jules Urbach will be the special guest. Hopefully, there will be more info on the ray tracing part then.


On to the voxels...
In March of this year, John Carmack stated in an interview that he was investigating a new rendering technique for his next generation engine (id Tech 6), which involves raycasting into a sparse voxel octree. This has spurred renewed interest in voxel rendering, and parallels with the new Ruby demo were quickly drawn.

Today's GPUs are already blazingly fast when it comes to polygon rendering and don't break a sweat in the multimillion triangle scenes of Crysis. So there must be a good reason why some developers are spending time and energy on voxel rendering. John Carmack explains it like this in the interview:


It’s interesting that if you look at representing this data in this particular sparse voxel octree format it winds up even being a more efficient way to store the 2D data as well as the 3D geometry data, because you don’t have packing and bordering issues. So we have incredibly high numbers; billions of triangles of data that you store in a very efficient manner. Now what is different about this versus a conventional ray tracing architecture is that it is a specialized data structure that you can ray trace into quite efficiently and that data structure brings you some significant benefits that you wouldn’t get from a triangular structure. It would be 50 or 100 times more data if you stored it out in a triangular mesh, which you couldn’t actually do in practice.

Jon Olick, programmer at id Software, provided some interesting details about the sparse voxel octree raycasting in this ompf thread. He will also give a talk on the subject at Siggraph.

In the ompf thread, there are also a number of interesting links to research papers about voxel octree raycasting:

A single-pass GPU ray casting framework for interactive out-of-core rendering of massive volumetric datasets, Enrico Gobbetti, Fabio Marton, and José Antonio Iglesias Guitián, 2008
http://www.crs4.it/vic/cgi-bin/bib-page.cgi?id=

Interactive Gigavoxels, Cyril Crassin, Fabrice Neyret, Sylvain Lefebvre 2008
http://artis.imag.fr/Publications/2008/CNL08/

Ray tracing into voxels compressed into an octree
http://www.sci.utah.edu/~wald/Publications/2007///MROct/download//mroct.pdf

The octree texture, Sylvain Lefebvre
http://lefebvre.sylvain.free.fr/octreetex/

The difference between id Tech 6 and Otoy is the way the voxels are rendered: id's sparse voxel octree tech is about voxel ray casting (primary rays only), while Otoy does voxel raytracing, which allows for raytraced reflections and possibly even raytraced shadows and photon mapping.
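
That distinction is easy to pin down in code. The sketch below is my own (neither engine's actual renderer, and the scene intersection is a hypothetical stand-in): ray casting shades whatever the primary ray hits, while ray tracing recursively spawns secondary rays for reflections.

#include <cstdio>

struct Hit { bool found; float reflectivity; float emitted; };

// Stand-in for octree traversal; what the ray hits here is made up.
Hit intersectScene(int depth) {
    return {true, depth == 0 ? 0.5f : 0.0f, 1.0f};
}

// Ray casting: primary rays only (the id Tech 6 approach, per the text).
float castPrimary() {
    Hit h = intersectScene(0);
    return h.found ? h.emitted : 0.0f;
}

// Ray tracing: follow reflections recursively (the Otoy approach, per the text).
float trace(int depth) {
    if (depth > 4) return 0.0f;                      // recursion cap
    Hit h = intersectScene(depth);
    if (!h.found) return 0.0f;
    float colour = h.emitted;
    if (h.reflectivity > 0.0f)
        colour += h.reflectivity * trace(depth + 1); // secondary ray
    return colour;
}

int main() {
    std::printf("cast:  %g\n", castPrimary());  // 1
    std::printf("trace: %g\n", trace(0));       // 1.5: one reflection added
}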