
User Interface Workshop – Feb. 2020


by William Reynish

The dust has now fully settled after the major UI changes that happened in Blender 2.80, which gives us a vantage point from which to plan Blender’s forthcoming UI-related work. For this reason, a few Blender developers met at the Blender HQ in Amsterdam for a two-day workshop and planning session. The group consisted of Ton Roosendaal, Dalai Felinto, Brecht van Lommel, Julian Eisel, Bastien Montagne, Pablo Vazquez and William Reynish.

Rather than going over every little detail that could be improved, we chose to focus on the major areas that will affect the user experience. Here is a summary of the meetings and sessions we had.

Asset Manager

What is an asset?

While the asset manager sounds simple, as you dig into it you start to realize that it deeply affects many areas of Blender. Questions like ‘what is an asset?’ are not necessarily straightforward to answer, nor are details such as exactly how assets are stored and referenced.

We think of assets as an ID-type (Collection, Object, Mesh, Material, Texture, Node Group, etc.) with a special ‘asset’ tag. Alongside the asset, we can also store additional information, like a set of metadata tags, the author, a rating and so on.
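
For illustration, here is a minimal Python sketch of how tagging a datablock as an asset and attaching metadata might look from the scripting side. The names used here (asset_mark, asset_data, the “Gold” material) are assumptions for the sake of the example, not a finished API:

```python
import bpy

# Hypothetical sketch: mark an existing material as an asset and attach
# metadata to it. The API names and the "Gold" material are assumptions.
material = bpy.data.materials["Gold"]
material.asset_mark()            # tag the ID datablock as an asset

meta = material.asset_data       # per-asset metadata block
meta.author = "Jane Artist"
meta.description = "Polished gold PBR material"
meta.tags.new("metal")
meta.tags.new("shiny")
```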

Library and Projects

We discussed the issue of assets by talking about two main use-cases:

  1. Library: There’s the use-case of using a central repository of assets for use as a starting point. This can either be built-in assets or a user-created asset library.
  2. Project: The second main use-case is for projects, such as movie. Here you typically link assets between blender files.

The two use-cases share some solutions but also have differences. 

For the Library use-case, we had to decide how assets are stored, since we want to present users with something that looks like a library of assets. We think the best solution is to store one asset per .blend file, since that way you can more easily share and delete assets from the file system. However, we would need to create a database cache so that browsing assets is fast and doesn’t require Blender to open each asset .blend to read its contents before you can see it in the browser.
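
To give an idea of why a cache helps, here is a rough sketch that assumes a library laid out as one asset per .blend file. It uses bpy.data.libraries.load to peek at datablock names without actually pulling any data in, and writes a simple JSON index; the directory layout and file names are assumptions:

```python
import bpy
import json
from pathlib import Path

LIBRARY_DIR = Path("~/blender_assets").expanduser()  # assumed library location
CACHE_FILE = LIBRARY_DIR / "asset_cache.json"

def build_cache():
    """Scan every .blend in the library and record which datablocks it contains."""
    cache = {}
    for blend in LIBRARY_DIR.glob("*.blend"):
        # With link=False and nothing assigned to data_to, no data is appended;
        # we only read the lists of datablock names stored in the file.
        with bpy.data.libraries.load(str(blend), link=False) as (data_from, data_to):
            cache[blend.name] = {
                "materials": list(data_from.materials),
                "objects": list(data_from.objects),
                "node_groups": list(data_from.node_groups),
            }
    CACHE_FILE.write_text(json.dumps(cache, indent=2))
    return cache

# A browser front end would read asset_cache.json instead of opening
# every .blend file each time it draws the library.
```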

For the Project use-case, we have the core issue that currently Blender has no real concept of a project to begin with. If you use the example of an open movie, you would want to link assets together within each project, but not between projects. And when you open Project A, you would want to automatically be presented with relevant Project A assets – not things that are meant for Project B. For this reason (and others), the first step is to make the project a core concept in Blender, applicable any time you want to create something that spans multiple .blend files. This concept was also discussed with the Grease Pencil team for the storyboard/VSE project that is planned for the future.

UI Layout and Polish

The UI layer and front end of the asset manager is in many ways the easiest part to design (although not necessarily the easiest to code). We simply want to present users with a great browsing experience that is fast, searchable, and fun to use. Throwing a wood material onto your model should be slick, interactive and feel tactile. Dragging furniture into a room should stick it to the floor, not leave it floating somewhere random in space. We agreed that we’d like to support these kinds of interactions, even though some of them are not strictly limited to things coming from the asset browser.

Asset Manager UI example

Brushes

A big part of the UI workshop was centred around the brush system. Brushes are currently a pain to deal with in Blender because they are stored inside each individual .blend file. This means that after users have spent lots of time setting up a nice set of custom brushes, those brushes are lost when a different .blend file is opened, and the user must manually append the relevant brushes every time. That is the equivalent of having to load your brushes every time you open a new image in an image editing or painting app – it’s not ideal.

Instead, we would like to treat brushes more like we treat Material Preview HDRIs – as global permanent items that live in your Blender library, and which can be used on any project. This sounds simple, but in practice it opens many questions. HDRIs can’t be edited and are very self-contained. Brushes can of course be edited at any time by the user, and can make use of things like textures that live in the scene. Making sense of this is potentially a bit complicated, but we think we have an idea of how it should work.

Brushes will be automatically loaded from the main Blender library. Any changes to them will stay local in the current Blender file. However, we would like to add a way to copy or override the brush for re-use later.

Brush selector

For more on assets, see https://developer.blender.org/T73366


Everything Nodes

One of the other big forthcoming changes this year will be the start of the conversion of Blender to become fully node-based. This not only has implications for Blender’s internals, but also the user interface. In particular, for this workshop, we focused on some concrete problems, as listed below.

Node Flow Types

Currently, in the Compositor and with Shader nodes, everything users see is what we call ‘data-flow’ nodes. Each node passes some data to the next node from left to right until we end up in a Compositor or Material Output node. At a conceptual level this is fairly easy to follow.

Because particle nodes are really more of a logic-based node system than compositing or shading, the challenge for Everything Nodes, and the Particle Nodes system in particular, is that we are also introducing two other types of connections that don’t represent data-flow at all: control-flow and effectors.

  1. Data-flow Nodes
    The nodes you already use in shaders and in the Compositor all represent a flow of data. Particle nodes will also include data-flow nodes.
  2. Control-flow Nodes
    These don’t represent data being passed from left to right at all. You can think of them as being a ‘condition gate’, and they can essentially be read from right to left.
  3. List Nodes
    Not a flow, but simply a list. We use these for effectors like emitters, forces and events. They are essentially to be read as a list from top to bottom.
Node flow types

The challenge is to communicate the difference between these node types, their connections and sockets. We brainstormed many different kinds of solutions:

Node connection styles for communicating different types of connections
Different ways to show stacked nodes

Inputs & Outputs

The other issue that the new node trees expose is the much heavier use of node groups as a core concept. This is important, especially in tandem with the asset manager, because it means users will be able to more easily re-use node groups, and also that we will ship lots of node group presets for higher-level functionality. Node groups allow us to build a layered interface where we encapsulate complexity, but they can always be opened and modified if users desire to go a level deeper.

Different visual concepts for showing that something is an input

The issue we have here is how to better handle inputs and outputs of group nodes. Currently they look just like normal nodes, but really they are a very different category. We can think of them as coming from a different dimension in the node tree.

Inputs from ‘outside’ the group

Multi-item Properties

Making it easier to view and edit properties across multiple selected items has been a long-standing request. For this workshop, we wanted to look at this in detail to make sure the proposed design was acceptable.

We already had a design for this, but we decided to do it properly rather than solve it with a quick hack. Doing this well has many implications for how to treat selections vs. active items, and which panels and properties can even be visible when multiple items are selected.

Multi-item properties were rethought as a deeper, but ultimately more thorough, change

Read here for details: https://developer.blender.org/T54862


The items above are not a comprehensive list of all UI improvements to be done, nor of all the work being contributed by the community, but they give a peek into what some of the core developers plan to focus on over the next period to further advance Blender’s user experience.

William Reynish
Blender UI Team


Asset Manager


The goal of the asset manager project is to help artists quickly re-use, share and organize assets and production files. It is one of the 10 projects for 2020, starting with a clear and focused design, planned to be implemented and expanded over multiple releases.

What is an asset?

“An asset is a data-block with meaning.”

A .blend file is a database with multiple data-blocks: objects, textures, materials, … When planning to re-use or share them, the data needs a meaning. What is this? What is this for? Assets are curated data-blocks that are meant for use outside of their original .blend file.

The asset’s meaning is handled as metadata (category, thumbnail, author, license, …). When linking or appending from a file, assets should be the primary filter option one sees.

The nature and origin of an asset determine its type:

  • User Presets – reusable assets that can be appended into any scene. For example a base mesh for sculpting, a gold material, or a fireworks particle simulation.
  • Project Assets – data-blocks in a project that are intended to be consistent throughout the project. For example a character or prop common to multiple shots in a film.

Note that user interface presets – matcaps, brushes, HDR studiolights, keymaps – are not part of the initial target for the asset manager project.

What is a repository?

“A repository is a container of assets.”

A repository is a folder-like structure containing multiple .blend files with assets inside. There are two kinds of repositories:

  • Presets repositories contain exclusively user presets that are appended or applied to the scene.
  • Project repositories contain exclusively project assets that are mostly linked, but they can also be appended or applied to the scene.

Eventually a .blend file itself could be treated as a full repository. However, performance would be problematic for large files, so this idea is on hold until there are more use cases to justify its development.

Life cycle of an asset

“Create, edit, share and use it.”

The asset manager is a new editor, responsible primarily for using assets and navigating repositories. However, it can also facilitate the creation, editing and even sharing of assets. These quality-of-life functionalities can still be performed manually for ultimate control.

  • Creating assets can be done manually. This is as simple as organizing data-blocks into one or more .blend files, and marking those data-blocks as assets (see the sketch after this list).
  • Editing an existing asset is possible by opening the .blend file directly from its repository.
  • Sharing assets works the same as sharing .blend files containing the asset data-blocks.
  • Using assets is done via a new editor – the asset manager – where you can simply drag assets from the enabled repositories into the scene.
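
As a rough illustration of the manual workflow, the sketch below appends a material datablock from a repository .blend using the regular Python API, the same way File → Append works. The file path and the “Gold” datablock name are assumptions:

```python
import bpy

# Hypothetical repository file holding a curated "Gold" material.
repo_blend = "/path/to/presets_repository/gold_material.blend"

# Append (not link) a datablock manually, exactly like File > Append.
# The directory is the .blend path followed by the datablock type.
bpy.ops.wm.append(
    filepath=repo_blend + "/Material/Gold",
    directory=repo_blend + "/Material/",
    filename="Gold",
)

# The appended material is now local data in the current file and can
# be assigned like any other material.
gold = bpy.data.materials.get("Gold")
```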

Milestones

The project will be implemented in steps:

  1. Basic editor – Managing user presets and preset repositories in a new asset manager editor.
  2. Online repository – Navigate preset repositories hosted online and download the assets on-demand.
  3. Usability – Improve the editor’s overall usability and extend drag-and-drop features.
  4. Variations – Support for different variations of an asset (e.g., Sintel young and old). The initial implementation will only handle collections.
  5. Project repositories – The definition of a project and its related resources (repositories) and settings.

To follow the latest development, check out this task. For related topics, read the write-up of the UI workshop last February.

Tracker Curfew Wrap Up


Since the release of Blender 2.80 beta at the end of 2018, the development team has been dealing with an increasing number of issues reported on the bug tracker.

After the 2.80 release, a further increase of reports pushed the count of “open issues” (issues that need the developers’ attention) close to 2000. This number used to be around 200-300.

2020 started with the mission of getting the issue tracker back under control, with the “tracker curfew” project. In just a few weeks, the development team managed to handle over 1300 open issues, while also dealing with the regular influx of new reports! In the meantime, Blender 2.82 was released and a long-term release strategy was announced.

The next paragraphs summarize the issue management work done in the past months and what to expect in the future.

Backlog

A lot of the existing open issues were based on deprecated features, or were simply not part of the current modules’ roadmaps. To clear up the situation and separate the good reports from the noise, the entire team of developers worked to re-triage the entire backlog of issues. That included both untriaged issues and issues that were already confirmed.

On February 18th, the backlog was fully processed, leaving only confirmed relevant bugs in the picture. Part of this was achieved by fixing bugs, part by closing old reports. A large number of them went from confirmed bugs to known issues.

Known Issues

A known issue is a confirmed report that contains all the information required to fix it, but that is not planned to be fixed in the next six months. Known issues are a way to document shortcomings, identify areas that need more developers, and give an honest statement about issues that won’t be fixed in the foreseeable future.

As expected, plenty of issues could be categorized as known issues. Some of them date back to 2013! After the backlog was cleared, the number of newly reported known issues kept growing, as expected.

The Trend

The original goal of bringing the number of open issues down to 200 was too ambitious, and it can’t be reached with the current pace of development and the existing team.

The bug tracker is receiving an average of 30 new reports every day. The number of daily closed or new known issues is similar to that. That is enough to keep things under control, but not enough to tackle the existing bugs. For that, more bugs need to be fixed and reports classified into bugs or known issues.

All Hands on Deck

To wrap up the Tracker Curfew and deliver a more stable Blender 2.83, the entire development team will work for the next two weeks on a bug-fixing sprint.

This may have to be repeated regularly in the future to prevent unattended bugs from dragging on too long. For now the focus is on making sure the remaining high-priority issues are fixed with enough time for testing before the next release. And if other bugs are squashed in the process, all the better.

This effectively shifts the deadlines for 2.83 and 2.90 two weeks into the future, to reduce the impact on the development cycle of new features in 2.90.


Changes to the Alembic Exporter


This blog post describes the intended changes to the Alembic exporter in Blender 2.90. Here Sybren Stüvel talks about these changes, the motivation behind them, and where feedback from the community is welcome.

Recently I have been working on a new Alembic exporter, structured so that both Universal Scene Description (USD) and Alembic can be exported using the same code base. This makes the export of Alembic and USD consistent, which comes with a few changes.

Transform: to inherit or not to inherit?

In Blender, an object always inherits the transform of its parent. In other words, the transform of an object is computed by taking the transform of the parent object, and then applying the object’s own transform on top of it. In Alembic, this is not the case. An object can be the child of another object, without inheriting its transform. This is a per-object setting that can be turned on or off.

When exporting to Alembic, Blender 2.83 turns on this “inherits transform” setting only when the object has a parent. All other transforms are marked “non-inheriting”.

Unless there are strong objections to this change, Blender 2.90 will simply always mark transforms as inheriting. This is how Blender works, and to me it makes sense to reflect that in the written Alembic files. More importantly, the Alembic plugin for USD only supports inheriting transforms and ignores non-inheriting ones. This change would make the Alembic files that Blender writes compatible with a USD-based workflow as well.
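
To make the difference concrete, here is a small Python sketch of the matrix an exporter would write for each object when transforms are always treated as inheriting. It is only an illustration of the idea, not the exporter’s actual code, and it ignores details like bone parenting:

```python
import bpy

def exported_matrix(obj):
    """Matrix written for an object whose Alembic transform inherits from its parent.

    With inheriting transforms the file stores the parent-relative matrix, and
    readers reconstruct the world matrix by multiplying down the hierarchy,
    exactly as Blender itself does.
    """
    if obj.parent is None:
        return obj.matrix_world.copy()
    # Parent-relative matrix: the world matrix expressed in the parent's space.
    return obj.parent.matrix_world.inverted() @ obj.matrix_world

for obj in bpy.context.scene.objects:
    print(obj.name, exported_matrix(obj))
```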

Names of Meshes

Let’s say there is a mesh object named Suzanne, of which the mesh is named MonkeyMesh. Blender 2.83 ignores the mesh name, and writes it to Alembic as “/Suzanne/SuzanneShape“.

The USD exporter will simply use the datablock name when exporting that datablock. That means that it uses the object name for the object and the mesh name for the mesh, exporting to “/Suzanne/MonkeyMesh” in the above example.

This is what the Alembic exporter in Blender 2.90 will do as well. In those cases where it is important to maintain the old naming, this can be done by changing the mesh name to the object name with “Shape” appended.
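
For scenes that depend on the old “<object>Shape” naming, a small script can restore it before exporting. A minimal sketch: it simply renames every mesh datablock after its object, so a mesh shared by several objects will end up following the last one processed:

```python
import bpy

# Rename mesh datablocks to "<object name>Shape" so the Alembic exporter
# in 2.90+ writes the same paths as 2.83 did, e.g. "/Suzanne/SuzanneShape".
for obj in bpy.data.objects:
    if obj.type == 'MESH' and obj.data is not None:
        obj.data.name = obj.name + "Shape"
```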

Names of Particle Systems

Almost all the data that is written to Alembic uses the same naming scheme: it takes the name from Blender and replaces spaces, periods, and colons with underscores. Particle systems are an exception here: in Blender 2.83 their names do not go through this replacement of characters.

By switching to the approach used by the USD exporter, the Alembic files will be using a consistent naming strategy. That means that all names will undergo the same replacement rules, and spaces, periods, and colons in particle system names will also be replaced with underscores.
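
The replacement rule itself is simple. As an illustration, here is the equivalent logic in Python (the exporter’s real implementation lives in Blender’s C++ code, this is just the same substitution applied to a name):

```python
import re

def abc_safe_name(name: str) -> str:
    """Replace spaces, periods and colons with underscores, as the exporter does."""
    return re.sub(r"[ .:]", "_", name)

print(abc_safe_name("Particle System.001"))  # -> "Particle_System_001"
print(abc_safe_name("rig:hair emitter"))     # -> "rig_hair_emitter"
```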

Note that this change is about applying consistent naming rules. The replacement of spaces, periods, and colons was part of the initial patch that brought Alembic support to Blender. Unfortunately, I have not been able to find any public definition of what forms a valid name in Alembic.

The Alembic source code seems to allow pretty much anything as a name, but as always, this not only depends on the file format but also on the capabilities of the software that reads the Alembic files. The Maya exporter seems to handle only colons as a special case, but this is of course related to which other symbols Maya does (not) accept in names. Feedback on this, or information on how other DCCs like Houdini handle this, is welcome.

Blender could be made more liberal in the allowed names as well, and also allow spaces or periods in the name. If you have more knowledge about this, either from format specifications or from studio workflow experience, please let us know in a comment.

Object Instances

When objects are instanced into the scene, the output of Blender 2.83 is sometimes hard to predict, and not the prettiest either. For each instance of an object, Blender 2.83 tries to write it to the Alembic file with its own name, but when this name is already used by a different object, it simply appends an underscore. This means that you can then have “Suzanne“, “Suzanne_“, all the way up to “Suzanne_________” if you have ten instances. When these instances are created by a particle system, it becomes even harder to know what is going on.

The USD exporter, and thus also the Alembic exporter in Blender 2.90, numbers the instances. The above example would simply produce “Suzanne-0” up to “Suzanne-10“. This also supports nesting, so when an instanced object itself also instances objects, it could result in a name like “Suzanne-7-9“.
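
A rough sketch of the numbering idea: each instance gets the index of its position under its duplicator appended to the name, and nested duplication simply appends another index. This is an illustration of the scheme, not the exporter’s code:

```python
def instance_name(base, indices):
    """Build an export name like "Suzanne-7-9" from nested instance indices."""
    return "-".join([base] + [str(i) for i in indices])

print(instance_name("Suzanne", [0]))     # first instance: "Suzanne-0"
print(instance_name("Suzanne", [7, 9]))  # instance 9 inside instance 7: "Suzanne-7-9"
```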

Dropped unofficial support for HDF5

The Alembic format supports two storage backends: HDF5 and Ogawa. Ogawa is the newer of the two, is faster to read and write, and also produces smaller files. Blender has never officially supported HDF5, but there was a build-time option to enable the format for import & export. This has never been enabled in any Blender release, though.

After a short discussion, it has been decided to remove the unofficial support for HDF5 as of Blender 2.90. Recent DCCs will very likely already write to the Ogawa format, and older files can be converted with the abcconvert tool.

Conclusion

All the above changes are aimed at better consistency in the exported files, and at giving artists and TDs more control. Please let me know in the comments what you think about these changes.

Bug Sprints


Building a better Blender

Bug Sprint Proposal – July 2020

This article proposes a new strategy for handling bug sprints during Blender’s development cycles. The proposal was met with positive consensus by the Blender maintainers team and will already be tried for the 2.90 release.

Before 2019

bcon1: 2-3 weeks
bcon2: 3-4 weeks
bcon3: 2-3 weeks
bcon4: 1-2 weeks *

Master open - new features for 5-7 weeks
Master closed - bug fixes only for 3-5 weeks

* Release candidate: 1 week per RC - once there were 3 RCs
Master closed for undefined periods. Average release cycle of 12 weeks. 3 to 4 releases a year.

Prior to the first 2.80 release, the team would go through the phases of the development cycle at the same time. This made bug sprints a natural stage of the development. A final polishing pass before a new Blender would be out.

However, the unpredictability of the release cycle duration, combined with the “narrow” features window, led to many unfinished features being rushed in at the last minute. This in turn would lead to more bugs and delayed releases. In the end, “release candidates” were great, but they would also delay a release.

Current system 2019-2020

bcon1: 9 weeks
bcon2: 4 weeks
bcon3: 4 weeks
bcon4: 1 week

Master always open
Overlapping branches, so master is always open. 18-week release cycle with a 5-week overlap. 4 releases a year (1 every 13 weeks).

With the 2.80 release, the overlapping release cycle allowed master to always be open for new features (good) and made release schedules predictable (good), but it also let bugs go unfixed for too long (bad) or be introduced at a fast pace.

Proposal for 2020 onwards

bcon1: 9 weeks
bcon2: 4 weeks
bcon3: 5 weeks
bcon4: 1 week

Master closed for bug fix only for 2 weeks in bcon2 and bcon3 respectively

Overlapping branches, so master is almost always open. 19-week release cycle with a 6-week overlap. 4 releases a year (1 every 13 weeks).

By shuffling a few weeks around, we can accommodate two bug sprint weeks within a release cycle. The first one is part of bcon2, to assess and address the critical issues in the backlog and prepare for the beta period. The other comes after the beta is out, to handle the issues reported then. This is distant enough from bcon4 that the reported issues can be fixed for the upcoming release (as opposed to master).

Both bug sprints are also instruments to bring the team together. Blender is a product that concerns everyone involved. Developers should be encouraged to reach outside their modules and comfort zones and help out wherever needed.

The bug sprints are also a way to balance the drive to innovate and add new features with the professional need for stable and mature software. The more features Blender gets, the more potential bugs it will get. Planning for this team-wide is a way to have everyone helping with Blender’s overall stability. Besides that, the development and user communities respond well to focused task forces like this.

The extra week in the release cycle is just to guarantee that the (now) longer bcon3+bcon4 still overlaps with parts of bcon1.

Bug sprint – May 2020

Commits in origin master

Results of the previous bug sprint (2.83, May 2020). The number of committed bug fixes doubled during the two-week-long bug sprint.

2 Bug sprints

First bug sprint in the middle of bcon2, to handle backlog.

Second bug sprint in the 2nd week of bcon3 to handle "beta" reports

Splashscreen changes on bcon3.

Once a new release is out, the number of bug reports doubles for a week or two. That led to many corrective releases in the past. As an example, 2.83.1 got 33 bug fixes ported over, and there are more to come in 2.83.2. Bringing back more testing at a specific time – when the new features are more stable (bcon3/beta) – should mitigate that.

To help promote stronger testing during bcon3, Blender could also get the “2.90” splashscreen in the beta builds. This should create buzz and drive users to test. It also adds healthy pressure on the developers to treat bcon3 as a state where features should be considered finished, or postponed to the next release (i.e., hidden from the UI and moved to the experimental panels).

Splitting the bug sprints in two helps developers plan ahead and clear up their week (EVERYTHING can wait a week for a reply, review, meeting, …). It also prevents the burnout of a two-week-long bug sprint followed by fixing newly reported high-priority beta bugs during bcon3 and bcon4.

There is always a bug sprint lurking around

Overlap of multiple releases in the same timeline

With the overlapping release cycles, there will always be a bug sprint not too far into the future. That makes for a quick assessment of some bugs:

  • Developers see it once, and decide on postponing it to the next bug sprint.
  • They see it again, and postpone it a second time.
  • By the third time they see it, it should be clear whether the bug won’t be fixed any time soon (known issue), or should be scheduled in the development agenda.

Either way, it keeps the backlog fresh and the team engaged in a smart use of its time (keeping things “agile”).

Blender 2.90

Final planning for 2.90 bug sprints.

Bug sprint week 13-17 July and 3-7 August.

This can start already for 2.90, following this proposed schedule.

Exception for a few developers

Alternative bug sprint planning for 2.90 for a few developers, 20-31 July

A few developers prefer to work two weeks back-to-back instead. The development team can easily accommodate that on a case-by-case basis, having them work during the in-between bug sprint weeks.

That said, the team is still encouraged to join efforts at the same time, for maximum integration of everyone involved, contributors included.


If you have any thoughts or feedback, please share them in the comments section.

2020/21 Recruitment


To complete the team and to be able to further professionalize the Blender project, the following open (paid) job positions have been defined.

The information below is being shared to create transparency in communication. There will probably be questions about this and the process – please comment here. Ton Roosendaal will reply swiftly.

Note that no formal calls for applications have been made for any of these jobs yet; recruiting will eventually be done via blender.org. Testing the waters here!

Development Infrastructure Engineer (DevOps)

  • Manage the blender.org infrastructure for developers (including Wiki, docs, blog, chat, etc)
  • Implement and manage automating tasks, such as testing and building.
  • Local job at Blender HQ Amsterdam.

Status: currently finalizing agreement with excellent candidate.

Development Community Coordinator

  • Coordinate and support online activities for every contributor to blender.org; volunteers and professionals alike.
  • Organize on-boarding for new developers.
  • Find and connect new talent within Blender projects.
  • Remote job. Preferably: someone in the North American timezone.

Blender UI Developer/Designer

  • Assist developers with visuals, mockups, tools and UI layouts.
  • Keep a close eye on UI and usability guidelines (consistency, core principles, etc.)
  • Preferably: someone with (basic) C and (decent) Python skills.
  • Availability to work at Blender HQ, Amsterdam.

Writer / Blogger / Editor

  • Write content for blender.org and Blender Cloud.
  • Should be able to write release logs, articles in the Code blog, press releases and studio production logs.
  • Preferably: video editing skills!
  • Availability to work at Blender HQ, Amsterdam.
  • Part-time possible.

Back-end Web Developer

  • Someone who understands and loves 3D animation (and game) production – the tools and infrastructure needed for it.
  • Manage the Blender Cloud back-end for publishing content.
  • Develop online tools which will be used and shared by the Blender Studio for production.
  • Available to work at Blender HQ, Amsterdam

Blender Foundation/Institute board of directors

This is a 3-5 year target: to appoint a board of directors to build and maintain a sustainable, future-proof organization for realizing the mission of Blender.

Recruitment for the operations role will start this year:

  • Operations/Finance director (Dutch resident & Dutch speaking)
  • Technical director (incl. managing industry relations)
  • Creative director (Blender design & UIX, branding, communication, demos/films)

Note: This is not a call for applications yet; it’s mainly to share transparently what’s to come in the near future. Recruiting will eventually be done via blender.org.

Thanks!

Ton Roosendaal

Faster Motion Blur with Intel Embree


Motion blur is essential for realistic rendering, but it comes at a price, especially in scenes with fast-moving characters featuring hair or fur.

Blender 2.90 introduced support for Intel Embree, reducing Cycles render times up to 10 times on production scenes such as the Agent 327 benchmark file.

Agent 327 benchmark scene available on cloud.blender.org

Stefan Werner, the initial author of the Embree integration in Blender, shares how Embree made its way into Blender.

“When I started working for Tangent Animation in 2017, one of the first requests they had was improving motion blur performance. Sergey Sharybin’s work had already given it quite a boost, but it was still too slow. “

Stefan Werner

Stefan already had experience with Embree from his previous work on Poser, so he implemented an initial integration.

“Within a few weeks, we had the first tests running with production scenes and were pleased to see that we not only got 2-10x speed improvements, but also were saving significant amounts of memory. From there on, it was only fixing a few loose ends and we were happily using it on Next Gen.”

Stefan Werner

While Embree has been available in Blender as a build option for over a year, it was not part of the official releases due to some differences in render results with the existing hair intersection code in Cycles.

Brecht Van Lommel took on the task of porting Embree’s hair intersection code to the rest of Cycles. Embree 3.7.0, released at the beginning of 2020, added the same rotation blur that Cycles was using; this was the final step to get them to render identically.

Download Blender 2.90 or the Blender Benchmark to try these improvements today.

Thanks to Intel Software for supporting the Blender Development Fund.


Cloth Sculpting improvements in Blender 2.91


Cloth sculpting was initially introduced in Blender 2.83 with the Cloth Brush. In 2.90, the Cloth Filter was added, which allowed cloth-like simulations to be run from Sculpt Mode. 

Blender 2.91 has made major improvements to cloth sculpting, vastly improving the usability of these tools.

New Cloth Snake Hook brush.

Collisions

The Sculpt Cloth Solver in 2.91 features initial support for collisions, both in the brush and in the Cloth Filter. The simulated cloth collides with objects which have a collision modifier, just like the regular cloth modifier.

Cloth Brush with collisions enabled

The cloth solver collisions in 2.91 are raycast-based. This means that they can collide with any type of geometry (even non-manifold), and vertices will be stopped by the surface of the collider, which translates to a new level of precision.

As a disadvantage, if a vertex from the cloth is inside the collider, the solver won’t be able to move it outside. The plan is to implement SDF (Signed Distance Fields) collisions as a way of solving this issue. This should make it possible to choose between more precision or more stability.

Self collisions are also planned. At the moment, they can’t be implemented efficiently for technical reasons. For those who are interested, the issue is the leaf node size of the PBVH. Reducing the leaf node size is also planned as part of Blender’s sculpt optimization project. After this is complete, self collisions should not be hard to implement. 

Soft Body Plasticity

Cloth Snake Hook brush with different levels of plasticity.

The Cloth Brush now supports plasticity, which controls how much the object tries to preserve its previous shape. This means that when the solver deforms the mesh, it tries to preserve the deformed shape during the simulation. This helps make some brushes more controllable and predictable, as well as allowing the simulation of different materials.

Support for plasticity collisions is also planned after SDF (Signed Distance Field) collisions are implemented.

Cloth Deformation Target

In 2.91, the Boundary and Pose brushes can also be used for cloth simulation by setting the Deformation Target to Cloth Simulation.

Boundary and Pose brushes with cloth simulation deformation target

The deformation these brushes produce allows the creation of common cloth effects, like curtain or sleeve folds, without relying on collisions.

In the future, all brushes in Sculpt Mode will support the cloth deformation target, so it will be possible to run the cloth solver on top of any brush deformation. 

Dynamic Simulation Area

In the first versions of the Cloth Brush, the cloth simulation was constrained by the initial area of the stroke. The alternative would have been to run a full cloth simulation on high poly meshes, which puts huge pressure on a machine’s performance.

Pinch Perpendicular and Inflate with dynamic simulation area on an 800k-vertex mesh.

In 2.91, the dynamic simulation area initializes and activates the constraints and the simulation as the brush moves. As the stroke is drawn, more constraints are created and added to the solver, but the simulation only runs using the constraints that are closer to the brush tip. So you can use the brush continuously without interruption, which means you can focus on the creative process.  

Thanks to the fade areas defined by the simulation falloff, the brush can blend the simulated and non-simulated parts of the mesh seamlessly.

Other improvements

The cloth brush and filter also include other improvements:

  • The cloth brush can now define a persistent base. This means that the same initial cloth shape can be simulated multiple times with different strokes. So you can start and stop whenever you need.
  • The cloth brush can apply global gravity and simulate the entire mesh during the stroke. In other words, the cloth effect is applied consistently throughout the scene.
Cloth Grab Brush with gravity applied to the whole mesh.
  • There is a new “pin simulation boundary” option, which allows creating better-anchored cloth brush strokes.
  • The cloth filter now supports custom orientation and has a new Scale mode.
  • There is a new Snake Hook deformation mode, which moves the cloth without creating artefacts in the simulation.
Gravity Cloth Filter with collisions and view orientation.

You can help the project by testing the new tools and hunting for bugs. If you find any, please report them from within Blender: simply go to the menu Help → Report a Bug, or watch this video about the process.

Daily builds are available as always on blender.org/experimental.


Support Blender development with a monthly subscription at fund.blender.org

Code Quality Day


The Blender project has existed for more than 20 years. During this time, its codebase has grown organically, with a healthy mix of refactors, brand-new code, and core parts that have survived the journey. There are abandoned parts too, left to their own devices; these need fixing, but also function quite well in their current state.

Sometimes technical debt starts for the best of reasons. But when left unresolved for too long, it can seriously impact the long-term sustainability of the project, cross-module pollination, and the overall stability and quality of the project.

To help mitigate these issues, the code quality day project began earlier this year. The goal was to make Blender more welcoming for new developers, as well as helping the software to scale.

The team will work together on making the Blender code easier to understand, rather than focusing on bugs and other tasks.

Read more details on the wiki.

This event happens on the first Friday of each month and started in February 2020. To celebrate its 10th edition, let’s look at some examples.

Clang Tidy

The Clang-Tidy static analyzer helps catch bugs before they are even reported, by flagging typical programming errors. Supporting it in the Blender codebase requires a lot of changes to prevent false positives, so this is still an ongoing effort.

However, the benefits of introducing automation are clear, and can be illustrated with the following issues.

Too small loop variable

It could potentially lead to data from DNA being handled wrongly.
A bit of a corner case, but no excuse to use types narrower than those in the DNA.


Misplaced widening cast

Typical errors which cause integer overflow.
User-measurable side effects include, for example, not being able to deal with images of very high resolution.


Assert side effect

Different behavior in debug and release builds, and worse performance in debug builds.


Platform specific bugs

Clang-Tidy detected a bug in studio light shading which was only happening on Windows. The code was (ab)using compiler-specific extensions.

Quality of Life

IDTypeInfo

Blender ID datablock types (Object, Collection, Armature, …) used to be hard-coded in Blender, each one with its own function calls and unique parameters.

They have now been changed to runtime registration, to help with the modularization of the read and write code. Operations such as copying data and making data local are now proper callbacks defined for each ID datablock.

Enable Sorted Includes

Blender is using clang-format to help unify its code style across the entire program. A recent addition was the option to sort the included headers alphabetically.

The change itself is quite small: the commit is a one-line change enabling this option in the .clang-format file. However, the codebase first had to be cleaned up so it was less dependent on a specific (and error-prone) order of includes.

Cleanups

The most popular changes are cleanups. A lot of new code is created taking the existing codebase as an example, so cleaning up old functions guarantees that old technical debt doesn’t get replicated.

Descriptive variable names

Descriptive variable names help the code to be better understood. They also work as a way of self-documenting parts of the code, and are aligned with modern code practices.

This commit renamed generic int i variables to view_id in the multi-view code.

Macros

A few macros make accessing Blender data more convenient and powerful. However, their adoption is not yet widespread across the codebase.

This commit updated more than 200 files to use the LISTBASE_FOREACH macro for datablock iteration.


In the coming months the Clang-Tidy work should be wrapped up, and soon enough it will be integrated into the patch review system. More cleanups and comments will happen not only as part of the code quality day, but will be incorporated as a daily practice in development.

It is important to find a good balance between handling technical debt and pushing development forward. Dedicating one day a month to this keeps the most pressing issues from dragging on too long, while also building a culture of high-quality new code.


Blender welcomes any contributor to join its project and help make a positive impact in the world. Get involved today!

Nodes Modifier Part I: Sprites


This initial milestone of the Everything Nodes Project will focus on the features that can be used by the Blender Studio. Now that the Sprite Fright short film has been officially announced, the use cases can finally be discussed publicly.

The Everything Nodes Project was started in 2019 with the Particle Nodes Project. The focus at the time was dynamic particle simulation. Around August it was decided to put the project on hold to make sure the design was on point. This was followed by a particle workshop in September, where the groundwork for the design was laid.

Moving forward, the focus shifted to tangible use cases that could be validated in production. The priority then shifted to work on geometry nodes, more specifically particle scattering for set dressing. Those were the features the Blender Studio artists were looking forward to the most.

Design

The geometry nodes follow the design of the particle system project, but their focus is narrower, addressing only static particles.

The underlying design is still the same as for the particle system, with a well-defined integration in the modifier stack as well as a clear dataflow.

The modifier’s input and output are directly connected to the non-node based modifiers. It also contains a node group that can be shared across different objects and Blender files.

Sprints and Agile

The team is working in a squad following the Scrum methodology. The project has two-week periods (sprints) where the team aims to achieve tangible results.

The initial sprint, October 19 to 30, was a mix of preparing the groundwork and organizing the backlog for the upcoming sprints.

Sample file prepared to validate the pebbles scattering use case

The second sprint aims at finishing up the first use case (pebble scattering) and having it fully integrated in master for the upcoming Blender 2.92.

The third sprint will focus on finishing set dressing for the Sprite project. It includes tree and flowers scattering, as well as procedural modeling for tree bark and moss (pending node design).

The team is still adjusting its planning to its delivery capacity, so there is a chance that procedural modeling happens as a separate sprint.

Plan of Attack and Backlog

There will be a constant balance between polishing and developing new features. This will depend on feedback received through the iterations.

The backlog is organized to try to match the immediate needs of Sprite Fright:

  • Pebbles scattering – set dressing
  • Flowers and trees scattering – set dressing
  • Tree bark – procedural modeling
  • Moss on tree – procedural modeling
  • Campfire – mantaflow integration
  • Hair spray – dynamic particles
  • Salt – dynamic particles
  • Snails crawling – dynamic particles

After this is out of the way, there are other features that would have helped previous open movies.

The goal of the team is to cover the basics for well-defined use cases and then hand the system back to the community.

With a well-defined design and the core functionality guaranteed to be solid, the community should be able to bring the feature-set to completion.

Meet the Team

The team consists of:

  • Dalai Felinto – Product Owner
  • Hans Goudey – Developer
  • Jacques Lucke – Lead Developer
  • Jeroen Bakker – Scrum Master
  • Pablo Vazquez – Product Design and Testing
  • Simon Thommes – Feature Requirements and Demos

Community Involvement

The work is organized publicly and the team welcomes anyone interested in contributing to development. And this has started already: contributor Léo Depoix implemented the subdivision surface node.

The sprints backlog is confirmed at the beginning of each sprint, and can be seen in the workboard.

Simplified view of the geometry nodes work board.

Tasks that are critical to the sprint are handled by the team. However, there are always plenty of “good to have” features that are not prioritized by the team such as extra nodes and refactoring.

Follow the discussions on geometry-nodes-squad on blender.chat (read-only). Contact the team members to find what you can help with.

Updates on this project will be communicated via the bf-committers mailing list. To help testing the features join the discussion on devtalk.blender.org.

Everything Nodes and the Scattered Stone


The much-anticipated first installment of the Everything Nodes project is finally integrated into Blender. It will be part of the upcoming Blender 2.92, to be released in 2021.

Pebbles scattering sample file.

This iteration implements the initial nodes required for the “scattering pebbles” use case. Artists will be able to set-dress their scenes by randomly placing simple assets, then tweaking a variety of parameters for full artistic control. A total of 24 nodes are available initially, and more will come.

Coming next is support for the “trees and flowers” use case. In this case, different assets that need to co-exist in the same set will be combined, as well as the scattering of full collections instead of individual objects.

Target file for the upcoming trees and flowers use case.

To learn more, check the user manual. Sample files are also available on the project page. For more details on the working process, check the recent post about Geometry Nodes.

Geometry Nodes and the Melting Iceberg


The geometry nodes project will soon make its debut in the upcoming Blender 2.92. Dalai Felinto made a presentation with an overview of the project process so far, with emphasis on what the team learned from it.

Blender Today – Host: Pablo Vazquez, Guest: Dalai Felinto

Changing a development culture takes time. However, most of the takeaways from this process are nothing new to Blender. Design-driven development and close collaboration with artists are at the core of what makes Blender so successful. The most recent challenge, however, is to find ways of handling this at scale.

Working with multi-faceted teams is an answer to some of those questions. And while the specific framework is secondary, the principles of agile development serve Blender well.

This team-based and design-driven development will be reproduced in other projects this quarter:

  • Library overrides (proxy replacement, review and documentation)
  • Asset browser (local asset browser and pose libraries)
  • Geometry nodes (polishing, attribute workflow, tools)


Module teams for core Blender development


Blender is growing fast. With the success of the Blender Development Fund and industry support, it’s important to make sure that the blender.org project organization remains future proof. Numerous activities around Blender are now performed by full-time employees or people working remotely on a grant.

Together, they take care of many core development projects and topics such as improving code quality, documentation, developer operations and support. All very important, but how do these efforts relate to work done by other contributors or by volunteers?

In the past months, I have been working with the Blender Institute crew to tackle our growing plans (and pains). When a team gets bigger you need operations management, coordinators, and human resources specialists. We need a definition of developer roles such as principal engineers, seniors, and product designers. And we need to define how projects are being organized in general.  

After reviewing popular development organizational styles (such as ‘agile’ or ‘squads’) I felt that this direction was a dead-end for Blender. We should not emulate a software company. I believe there is one aspect of Blender we should never give up on:

Blender is a community effort.

As we all know, communities are messy, noisy and disorganized. It takes a lot of energy to get an online community to move in a chosen direction, to agree on things and to get them to collaborate on topics. And even worse, open source communities often lose top talent because the best feel dragged down to the level of the group as a whole — which includes beginners or others still developing their abilities. That’s the main criticism of community-driven projects. How do you combine striving for excellence and the quest for quality with a public project that’s accessible to everyone?

Luckily we already had an answer: the module team organization we’ve used for almost 20 years. It just needed an upgrade.


Let’s divide Blender tasks into three categories. Operational, Tactical and Strategic.

Operational: bug triaging, onboarding, documentation, website development, testing, communication, facility management, administration.

Tactical: well-defined short term development projects, work that ends up in releases, student projects, maintenance and upgrades of the code, wrap up unfinished features, make Blender releases. 

Strategic: general roadmaps, product designs, industry relationships, research, mission critical software projects, keep top talent on board.

The Blender organization can be held responsible for all operational aspects, facilitating the blender.org project, and welcoming contributions from the community. In these roles we currently employ several people; including a DevOps engineer and forum moderators.

Developers hired by Blender Institute will be assigned to specific strategic projects. These usually have only one goal: to get innovative designs working as ‘MVP’ (minimal viable prototype) and hand them over to the module teams as quickly as possible.

This makes the module teams on blender.org the “tactical teams” in Blender. That’s where the real open source dynamic kicks in. This is where the actual magic happens. It’s public, sometimes messy and noisy, but often incredibly rewarding and surprisingly effective. Good examples are work contributed in the areas of Grease Pencil and Sculpting.

Strategic contributions to Blender can also be provided by other organizations or teams. This is already happening — for example, NVIDIA and Tangent Animation assigned engineers to help with integrating Pixar’s USD in Blender.

Obviously it’s the Blender Foundation’s task to frequently present and discuss strategic roadmaps for Blender, and to make sure the module teams are aligned. That’s a topic for another post.


Module Teams

Modules are largely free to organize themselves. Each type of module might require different management styles or procedures. Some modules will be more difficult to join (Cycles & Rendering), other modules might be stricter in terms of accepting patches (Core Blender module).

Within a module there are two roles: the “owners” and the “members”.

The main rules for modules are:

  • Module owners are empowered to commit code.
  • Module owners decide together as a consensus (unanimous).
  • Module members need an owner to accept or review their work.
  • Modules only use public blender.org platforms (code & communication).

Blender module teams should be as large as possible. If they grow too big, they can split up. Technical Artists (TAs) must also be included among each module’s members.

Module teams are responsible for issues in their own code (the module) but should feel free to move open issues to a todo list to deal with later. Module Owners are held accountable: their role implies they accept responsibility. 

Modules can expect wide-ranging support by the Blender organization, both for operational tasks but also for Development Fund grants (to retain essential people).

You can read more about how the module organization works in the Blender Wiki.

New: Rendering Module

The Cycles, Eevee and Workbench rendering projects in Blender have evolved to become reliable, production-quality software, using advanced engineering principles. For that reason we decided to join these projects into one module, and to support the module owners in inviting only senior software developers as members. In that sense the module can operate as a tightly organized ‘squad’, enabled to manage complex design and engineering tasks and ensure Blender rendering remains future proof.

The Blender organization will support this module on an operational level, by providing sufficient developers and TAs to help with bug triaging, patch reviews, testing, onboarding and documentation.

New: Core Module

Members of this module will guard and protect the core of Blender’s software design; that includes the DNA and RNA libraries, the .blend file format, the undo system, core kernel code, the window manager and editors design, and general support libraries.

Membership of this module will be restricted to people with several years of experience with Blender development. The module will also be responsible for general guidelines on code standards and engineering practices.

Thanks for reading! 

Ton Roosendaal
Chairman Blender Foundation

Amsterdam, 15 February 2021

Render Modules Update


The Blender development team keeps growing, and we are in the process of improving the organization and hiring new employees to accommodate this. For me personally, this provides an opportunity to focus again on what I’m most passionate about.

My new role – as principal engineer – will be focusing on rendering development in Blender, and I will spend most of my time as a software developer working on Cycles again.

My previous role was as chief software architect for Blender. I will continue to be available for advice on software design for any part of Blender. But I will no longer be as much involved in organizing, reviewing or signing off on development projects outside of rendering.


There were two rendering module teams in Blender:

  • Render & Cycles: includes Cycles, Blender render pipeline, color management, materials, textures, etc.
  • Eevee & Viewport: includes Eevee, 3D viewport drawing, OpenGL & Vulkan.

These will be merged into a single Rendering module team. This is an organizational change, and one that has already happened for the most part. We’ve been holding weekly rendering meetings for planning and discussion of both modules, and will continue to do so.

Note that Cycles and Eevee remain separate renderers. We will work together to ensure feature compatibility and make changes that benefit both native and external renderers.

The module owners are Clément Foucault, Jeroen Bakker and myself. Adding Jeroen as a module owner reflects his existing high involvement in both modules, and together with Clément his focus is on the Eevee & Viewport part.

For Cycles there are various active individuals and companies contributing to the module. Kévin Dietrich is currently working on Cycles with a grant for specific interactive rendering optimizations, and we’re also looking to hire another developer for general Cycles and Blender render pipeline development.


Rendering is an area that benefits from long-term, continued improvement. We have much work to do lowering render times, handling bigger scenes, simplifying settings, adding more advanced and easy-to-use shaders, and improving interop with the rest of Blender and other apps. In the end it’s all about making the path from an idea to a beautiful render as seamless as possible.

We are looking for artist module members to join the rendering modules, particularly to help test new features, make demo files, and create release notes and docs. More developers are also always welcome, especially to help with bug fixing, code review and incremental quality of life improvements.

– Brecht Van Lommel


Blender 2021 Roadmap


2021 promises to be a busy and exciting year. We will be working on the second LTS release and on Blender 3.0, which includes a lot of new development. This year also marks the 10th anniversary of Cycles.

There will be more emphasis on the modules as a way for everyone in the development community to get involved. Combined with the Blender HQ project teams, this should help bootstrap new and existing initiatives while making sure they are maintained in the long run.

Abstract '3' surrounded by colourful hairs

Planning

The year began by assigning three core developer teams to focus for two months on the following projects: asset browser and pose library, library overrides, and geometry nodes. This began in February and will be wrapped up soon.

Blender 2.93 will be released in late May. Much like version 2.83, this will be an LTS (Long-term Support) release, meaning it will be maintained for two years.

Q2 also sees the kick-off of the animation character pipeline project — barring unforeseen restrictions.

NVIDIA’s industry-quality work on the USD importer is already under review. It should debut after Blender 2.93 is released.

Coming in early summer: a usability workshop will be held in Amsterdam with Blender designer William Reynish and other UI/UX contributors — all in preparation for Blender 3.0.

Projects

  • Asset browser and pose library
  • Library overrides
  • Geometry nodes
  • Vulkan
  • Grease Pencil
  • Blender 2.93 LTS
  • Cycles development
  • Animation character pipeline
  • USD importer
  • Blender 3.0 – user interface workshop

Asset browser and pose library

The asset browser project dates back to 2016 (when it was called asset management). Over the years and multiple iterations, its goals narrowed. But since February, the project has been rebooted with a broader set of goals.

Spring sample file with the pose library

For years the Blender Studio has needed a robust pose library system for its animation projects. Sprite Fright provided the perfect opportunity to address this while also helping the asset browser project.

It’s likely that the pose library system will become the first target for the asset browser project. It will be completely integrated in both the viewport and animation editors, and help focus the asset browser project in time for Blender 3.0.
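
For pipeline scripting, data-blocks can be tagged as assets through the Python API. Below is a minimal sketch of marking an Action for the pose library, assuming the asset_mark() ID method available in builds that include the asset browser; the action name is hypothetical.

```python
import bpy

# Minimal sketch (assumes a build with the asset browser): tag an existing
# Action as an asset so it can show up in the asset browser / pose library.
# "Walk Cycle" is a hypothetical action name.
action = bpy.data.actions["Walk Cycle"]
action.asset_mark()
action.asset_data.description = "Reusable walk cycle pose asset"
```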

Library overrides

Replacing the old animation proxy system remains a work in progress. As part of this process, the current development cycle aims to finish rigging syncing, and wrap up the system’s final documentation.

Sprite Fright is the first Open Movie to use the library overrides system. This helps stress test support for multiple animated instances of the same character.

Library overrides – Sprite Fright demo

This initial polishing cycle will be followed by another project to tackle the restrictive pipeline. At that time riggers will be able to hand-pick which properties animators are allowed to override. This still depends on finding interested studios for testing and feedback — as well as someone to help with UI/UX.
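
For those scripting production setups, a library override can already be created from Python today. Here is a minimal sketch using the existing override_create() ID method; the linked object name is hypothetical.

```python
import bpy

# Minimal sketch: create a local library override for a linked object so
# animators can edit its properties. "CharacterRig" is a hypothetical name
# of an object linked from another .blend file.
linked = bpy.data.objects["CharacterRig"]
override = linked.override_create()
print(f"Created override: {override.name}")
```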

Geometry Nodes

The second project related to geometry nodes has just ended. It expanded on the initial project by adding new nodes and new ways for artists to work with the system. This eight-week cycle had four pillars:

  1. Wrapping up Blender Studio requirements, and polishing.
  2. Everything nodes design.
  3. Attribute workflow.
  4. Node tools.

Flowers and grass distribution sample file

Twenty new geometry nodes were introduced. These include the long-awaited mesh primitives, plus more advanced scattering, procedural modelling options, and a new texture-based pipeline. A brand-new spreadsheet editor was added to help debug complex node trees. In addition, the project brought usability improvements such as attribute search and error messages.
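
As a hedged illustration of how the system is exposed to scripts, here is a minimal sketch that attaches a Geometry Nodes modifier and an empty node group to the active object; specific node identifiers are omitted since they changed during development.

```python
import bpy

# Illustrative sketch: add a Geometry Nodes modifier to the active object
# and hook up a fresh (empty) node group. Specific scatter/primitive node
# identifiers are left out, as they varied during the 2.9x series.
obj = bpy.context.active_object
mod = obj.modifiers.new(name="GeometryNodes", type='NODES')
tree = bpy.data.node_groups.new("Scatter Demo", 'GeometryNodeTree')
mod.node_group = tree
```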

A few sprints were also dedicated to design and prototyping, helping to prepare the ground for future projects. This included collection nodes, re-validation of the hair nodes design, node tools, pages and portals.

Vulkan

The drawing backend is being prepared to receive Vulkan. This abstraction of the drawing API will allow Blender to use more modern libraries for drawing, which also helps make EEVEE more memory efficient.

Vulkan logo

It’s worth noting that there are no immediate performance boosts expected from Vulkan’s integration. However, it will help make Blender future proof and ready for vendor-specific platforms.

Grease Pencil

The 2D drawing tools in Blender got a big improvement with the recently added Line Art.

Blueprint of an airplane
Artwork by Yiming Wu.

The emphasis this year is on new Line Art modifiers, storyboarding, I/O, better Bézier editing, and features for 2D/2.5D animated feature films.
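
Line Art ships as a Grease Pencil modifier. As a rough sketch (identifiers as exposed in the 2.93 Python API; double-check against your build), it can be added from a script like this:

```python
import bpy

# Rough sketch, assuming a Grease Pencil object is active and the 2.93
# identifiers ('GP_LINEART', source_type) are unchanged in your build.
gp_object = bpy.context.active_object
lineart = gp_object.grease_pencil_modifiers.new(name="Line Art", type='GP_LINEART')
lineart.source_type = 'SCENE'  # generate lines from the whole scene
# A target layer and material still need to be assigned before strokes appear.
```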

Blender 2.93 LTS

The Long-term Support pilot was a success with 13 versions in one year, and hundreds of ported fixes. Downloads were in the hundreds of thousands. The second LTS will come in April.

Release schedule with LTS overlaps

Blender 2.83 LTS will be maintained for another year, while 2.93 LTS will be maintained for two years. The long-term stable releases will receive fixes for high-priority bugs and regressions, as well as driver compatibility updates.

Cycles development

Through industry support, Cycles saw big improvements in its render API thanks to a dedicated developer working closely with Facebook.

Cycles rendering of a Lone Monk
Artwork by Carlo Bergonzini.

Following that, the Blender project is hiring a new senior rendering engineer to help the Cycles team. The hiring process is almost complete.

The 10th anniversary of Cycles is on April 23rd. To celebrate, Blender is working on a special surprise. Save the date!

Animation character pipeline

Feature film production is a real challenge. The scale alone brings unique problems that go beyond mere performance improvements. Specific tools are required in order to create a streamlined workflow, helping the entire team maintain the same level of quality.

Rex facial expression tests – Sprite Fright

In the past, this project was put on hold due to lack of industry support. It required both a talented multi-disciplinary team and funding in order to bring everyone onboard. Happily, we now have both! Industry veteran Jason Schleifer will act as the liaison for this project.

The pose library and library overrides are part of the tools expected for this pipeline. However, better tools for rigging, playback and animation in general will also be tackled.

USD importer

NVIDIA is involved in bringing USD support to Blender. Developer Michael Kowalski is working directly with the rest of the Blender development team in order to make this happen.

USD logo

The initial plan is to focus on USD importing. As the exporter code was already in place, this seemed like the obvious place to start. Michael’s work is currently under review with the aim of incorporating it into the Blender 3.0 release.

The importer is built on top of the existing I/O pipeline. However, Collections for I/O has also been discussed and may become part of the USD integration.
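
Since the exporter is already in place, it can be driven from Python today. A minimal sketch follows; the file path is illustrative, and the importer under review is expected to provide a matching operator once it lands.

```python
import bpy

# Minimal sketch: the USD exporter that already ships with Blender can be
# called from a script. The file path is illustrative; the importer under
# review is expected to provide a matching operator once it lands.
bpy.ops.wm.usd_export(filepath="/tmp/scene.usd")
```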

Blender 3.0 – user interface workshop

Finally on the agenda is a workshop to ensure that the Blender 3 series is implemented with all the necessary care.

Blender interface mockup
Multi-object select mockup by William Reynish.

The Blender development team is working towards a Blender 3.0 release in Q3, probably in September. As always with major releases, we’ll take the time to revise some of Blender’s design choices and introduce big solutions to existing issues.

As soon as international travelling is again possible, there will be a UI/UX workshop in Amsterdam. The goal is to find outstanding solutions to pressing problems.

Surprises

There is value in solid planning. However, not everything should be set in stone ahead of time. There are plenty of other projects expected for 2021, but they will be revealed throughout the year. The idea for the core team is to work in 6-8 week projects, hand over to the modules, rinse and repeat.

The short cycles, alternated with weeks of regular module development, will help more projects see the light of day. This also ensures that the development team can remain nimble and pick the best projects to tackle at any given time. This keeps priorities fresh and planning realistic.

As a teaser, here are a few ideas being considered:

  • Independent physics clock in viewports
  • Mesh editing optimization
  • Brush manager for painting and sculpting
  • Snapping improvements
  • Real time viewport video compositor
  • Collections settings for persistent I/O and baking
  • Restrictive overrides
  • Collection nodes
  • Dynamic particles

Cycles X

Today it’s been exactly 10 years since Cycles was announced. In the past decade Cycles has developed into a full-fledged production renderer, used by many artists and studios. We learned a lot in those 10 years, things that worked well, but also things that didn’t work well, or became outdated as rendering algorithms and hardware evolved.

We’re keen to make bigger improvements to core Cycles rendering. However, some decisions made in the past are holding back performance and making it difficult to maintain the code. To address that, Sergey and I started a research project named Cycles X, with the aim of refreshing the architecture and preparing it for the next 10 years. Rather than finding quick fixes or optimizations that solve only part of the problem, we’re rethinking the architecture as a whole.

The Project

Broadly speaking, the goal of the project is to:

  • Improve the architecture for future development
  • Improve usability of viewport and batch rendering
  • Improve performance on modern CPUs and GPUs
  • Introduce more advanced rendering algorithms

Our first target was to validate the new architecture. To that end, we’ve implemented a prototype of a new GPU kernel, and new scheduling algorithms for viewport and batch renders. There’s just enough functionality to render some of our benchmark scenes now.

Current Cycles X kernel graph

Today we’re sharing some initial performance results, and publishing the code to collaborate with Cycles contributors. A technical presentation for developers on the new architecture is available, and the code can be found in the cycles-x branch on git.blender.org.

There is much to be done. We expect it will take at least 6 months until this work is part of an official Blender release.

Initial Results

First, some results from GPU rendering with well-known benchmark scenes. Scenes have been modified to remove features like volume rendering, which are not implemented yet.

Be aware that the numbers will change as we keep working on the new architecture. OptiX support was added just a few days ago by Patrick Mours.

The most significant improvements are in interior scenes with many light bounces and shaders, where the new kernels can achieve higher occupancy and coherence.

CPU rendering performance is approximately the same as before at this point, but the new architecture opens up new possibilities there as well.

Secondly, we’ve been working to improve viewport rendering. Faster rendering kernels help, but we also found that improving the scheduling, timing, and display mechanisms can make the viewport feel more interactive. New viewport support for adaptive sampling and sample batching means the image cleans up faster once the first few samples are done.
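
For reference, adaptive sampling for final renders can already be enabled through the existing Python properties; the viewport-specific behaviour described above is still being developed in the cycles-x branch. A minimal sketch:

```python
import bpy

# Minimal sketch using existing Cycles properties: enable adaptive sampling
# for final renders. The viewport-specific scheduling discussed above is
# still under development in the cycles-x branch and has no stable API yet.
scene = bpy.context.scene
scene.cycles.use_adaptive_sampling = True
scene.cycles.adaptive_threshold = 0.01   # lower values mean less noise, more samples
scene.cycles.adaptive_min_samples = 0    # 0 lets Cycles pick a sensible minimum
```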

CPU viewport rendering
CPU viewport rendering with Open Image Denoiser
GPU viewport rendering
Viewport adaptive sampling comparison

Looking Forward

In the coming months we will try more optimization ideas, and restore missing functionality. When functionality is missing, it’s usually because we want to take a different approach in the new architecture. Some examples:

  • Volume rendering: we plan to implement ray-marching and light sampling with more modern algorithms
  • Shadow catchers: we’ll try a different algorithm that can take into account indirect light
  • Multi-device rendering: we’ll experiment with more fine-grained load balancing without tiles

Beyond this, the new architecture should let us more easily fit in rendering algorithms like path guiding, which we will experiment with while researching how they can be made GPU friendly.

Deprecation

As part of the new architecture, we are removing some functionality. Most notably:

  • OpenCL rendering kernels. The combination of the limited Cycles split kernel implementation, driver bugs, and stalled OpenCL standard has made maintenance too difficult. We can only make the kinds of bigger changes we are working on now by starting from a clean slate.
    We are working with AMD and Intel to get the new kernels working on their GPUs, possibly using different APIs. This will not necessarily be ready for the first release, the implementation needs to reach a higher quality bar than what is there now. Long term, supporting all major GPU hardware vendors remains an important goal.
  • Branched path tracing. We are working to improve sampling algorithms to make this obsolete, and more automatically assign samples where needed. Improved adaptive sampling and light importance sampling are key here.
  • NLM denoiser. AI denoising algorithms and in particular OpenImageDenoise generally yield better results, and we will optimize the architecture and workflow for them.

These features will remain available and supported in 2.83 and 2.93 LTS releases.

Mesh Editing Optimization – Initial Steps

Since releasing Blender 2.8x there have been performance regressions with mesh editing that haven’t been addressed. While Blender developers were aware of these problems, early on a lot of time was spent investigating and fixing bugs in order to get feature parity with 2.7x. Later some improvements were made and Blender was once again functional for artists at Blender Studio (where they were not doing a lot of high-poly mesh editing).

Sprite Fright by Blender Studio – Forest Mesh Editing

Having said this, artists in the community have been asking for improvements in this area (see thread on blenderartists).

This kind of development needs focused time to investigate issues and try different solutions. After all, it is not so easy to fit this between regular development tasks. So the Blender development team has set aside some time to improve the situation with a 6-8 week sprint involving 3 developers (Germano Cavalcante, Jeroen Bakker and myself).

The team started investigating bottlenecks in more detail with a focus on edit mesh. Early test results show there is room for significant performance improvements.

In brief, the main bottlenecks are uploading data to the GPU as well as redundant data-duplication and GPU-data rebuilding that can be skipped entirely.
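
For readers who want to reproduce such measurements, here is a rough benchmarking sketch that times entering and leaving edit mode on a dense grid; numbers will of course vary with hardware and scene content.

```python
import time
import bpy

# Rough benchmark sketch: build a dense grid (about one million faces) and
# time the edit-mode toggle, which exercises the data duplication and
# GPU-upload paths mentioned above.
bpy.ops.mesh.primitive_grid_add(x_subdivisions=1000, y_subdivisions=1000)

start = time.perf_counter()
bpy.ops.object.mode_set(mode='EDIT')
print(f"Enter edit mode: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
bpy.ops.object.mode_set(mode='OBJECT')
print(f"Exit edit mode: {time.perf_counter() - start:.3f}s")
```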

For details see:

… not so fast!

While gains in mesh editing should be achievable, it’s possible users with complex files won’t notice much difference. For instance, files where subdivision surface is the bottleneck require development specifically to optimize OpenSubdiv. In general, files with heavy use of modifiers may not see much in the way of overall speed improvements, since that’s not the initial target at the moment.

The focus of this sprint will be on high priority areas, while other projects can be prioritized in the future.

— Campbell Barton

Blender OSS and Licenses

Blender 2.93 marks the second installment of the LTS program. This time around, the bar for sharing the builds was raised to a new level. Blender is now fully compliant with the highest standards required by third parties to redistribute it.

"The Lone Blue Widebeest (GNU)" with changes, original credit: Ray in Manila - CC BY 2.0
The Lone Blue Widebeest (GNU)” with changes, original credit: Ray in Manila – CC BY 2.0

This all started in May 2020. At the time, Blender’s license was explicitly mentioned on the website as well as in a single license document present in the code base. However, this didn’t account for the components Blender depends on.

More specifically, while Blender itself is licensed as GPL 2.0 or later, Blender depends on multiple libraries that each have a different license. To quickly mitigate that, Blender 2.91 started to ship with a license folder that allows users to check the full licenses of all the components built with Blender.

With the release out of the way, the Blender team could revisit this topic. After a second pass on the state of the licenses, and …

This was not enough.

Some licenses require the full copyright notice to be distributed with the software, not only the license text. Usually the copyright refers to the original author of the library or the project maintaining it. For that reason, an Attribution Document was built for Blender 2.92 using the OSS Attribution Builder.

The attribution document is now part of the release checklist and updated every release. While the license folder contains all the licenses of the internal components of Blender, the attribution document covers all the external libraries.

Another Blender release was another opportunity to assess the requirements for its distribution, and …

This was still not good enough.

One of Blender’s long-standing issues is that a lot of the libraries it depends on are scattered around the web. What happens if the servers are down? Or if the library version needed for an old version of Blender vanishes from their backup?

Add to this that some of those libraries have licenses that require Blender to either host a copy of the source code, or to provide an “offer for source”. For years, third parties distributing Blender had simply trusted Blender to keep this in check — either by making sure the links to other projects are working or that there are local copies hanging around.

To solve this once and for all, Blender 2.93 source code will be distributed with all the libraries used for that version. For incremental versions and LTS updates only the main Blender source code package will be distributed. This was also retroactively implemented for the current stable releases of 2.83 LTS and 2.92 (see links below).

Blender now ships with even higher standards. This was a year-long endeavour, but worth it. Hopefully this journey inspires and assists other open source projects to give licenses the attention they deserve.


This is part of the ongoing effort to professionalize Blender’s infrastructure. A big thanks to the team at Amazon’s Open Source Programs Office for bringing up this topic originally and providing guidance.

Timeline:

Links:

EEVEE’s Future

EEVEE has been evolving constantly since its introduction in Blender 2.80. The goal has been to make it viable both for asset creation and final rendering, and to support a wide range of workflows. However, thanks to the latest hardware innovations, many new techniques have become viable, and EEVEE can take advantage of them.

A new beginning

For the Blender 3.x series EEVEE’s core architecture will be overhauled to make a solid base for the many new features to come. The following are the main motivations for this restructuring.

Render Passes and Real-Time Compositor

A core motivation for the planned changes to EEVEE’s architecture is the possibility to output all render passes efficiently. The architecture is centered around a decoupling of lighting passes, but will still be fully optimized for more general use cases. Efficient output of render passes is a must in order to use the planned real-time viewport compositor at its full potential. This new paradigm will also speed up the rendering process when many passes are enabled, and all render passes will be supported, including AOVs.
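
Render passes themselves are already exposed per view layer in the Python API. A minimal sketch of enabling a few of them follows; the real-time compositor itself has no scripting interface yet.

```python
import bpy

# Minimal sketch: enable a handful of render passes on the active view layer
# using existing properties. The planned real-time compositor is not
# scriptable yet; this only shows how passes are toggled today.
view_layer = bpy.context.view_layer
view_layer.use_pass_combined = True
view_layer.use_pass_z = True
view_layer.use_pass_normal = True
view_layer.use_pass_mist = True
```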

Screen Space Global Illumination

The second motivation for the rewrite was the Screen Space Global Illumination (SSGI) prototype by Jon Henry Marvin Faltis (also known as 0451). Although SSGI is not necessarily a good fit for the wide range of EEVEE’s supported workflows, strong community interest in the prototype shows that there is a demand for a more straightforward global illumination workflow. With SSGI, bounce lighting can be previewed without baking and is applied to all BSDF nodes. This means faster iteration time and support for light bounces from dynamic objects.

A demo of the SSGI prototype branch by Hirokazu Yokohara.

Hardware Ray-Tracing

Supporting SSGI brings ray-tracing support a step closer. The new architecture will make the addition of hardware ray-tracing much easier in the future. Hardware ray-tracing will fix most limitations of screen space techniques, improve refraction, shadows … the list goes on. However, implementing all of these will not be as easy as it sounds. EEVEE will use ray-tracing to fix some big limitations, but there are no plans to make EEVEE a full-blown ray-tracer. For instance, ray-tracing won’t be used for subsurface scattering (not in the traditional way at least).

Note that hardware ray-tracing is not a target for the 3.0 release but one of the motivations for the overhaul.

Volumetric Improvements

Volume rendering is also something that should be improved with this update. In the current version of EEVEE released in 2.93, object volume materials are applied to the object’s entire bounding box and are still constrained to a rather coarse view-aligned volume grid.

The goal is to improve that by providing a volume rendering method that evaluates volumes in object space. This means volumes will be evaluated individually and per pixel, allowing for high-fidelity volumetric effects without paying the cost of the unified volumetric approach (the current method used by EEVEE). This would also make volume shadow casting possible for those objects.
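
For comparison, the current unified approach is controlled by a handful of scene-wide settings. A minimal sketch using today’s properties (the planned object-space evaluation has no settings yet):

```python
import bpy

# Minimal sketch of today's unified volumetrics: one view-aligned grid whose
# resolution is set per scene. The planned object-space evaluation would
# remove the need to trade quality against this global grid size.
eevee = bpy.context.scene.eevee
eevee.volumetric_tile_size = '2'   # finer grid, higher cost
eevee.volumetric_samples = 128     # steps along the view direction
```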

These changes would make the following use cases trivial to render, compared to the current implementation, where it would be impossible to reach this quality (the examples below were rendered using Cycles).

Water surface with absorption volume, rendered with Cycles
Highly detailed smoke simulation with shadows projected onto the scene, rendered with Cycles

New shader execution model

The current shader execution is straightforward and executes all BSDF nodes in a material node graph. However, this can become very costly with complex materials with many BSDF nodes, leading to very long shader compilation times.

The approach in the rewrite will sample one BSDF at random, only paying the cost of a single evaluation. A nice consequence of this is that all BSDF nodes will be treated equally with all effects applied (Subsurface Scattering, Screen Space Reflections, etc…).
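
To illustrate the idea (this is a plain Python sketch of stochastic closure selection, not EEVEE’s actual GLSL code), one closure is picked proportionally to its weight and the result is rescaled so the estimate stays unbiased:

```python
import random

def sample_closure(closures, rng=random):
    """Pick one (weight, bsdf) pair proportionally to its weight.

    Returns the chosen bsdf and a compensation factor; dividing by the
    selection probability keeps the Monte Carlo estimate unbiased.
    Plain Python sketch of the idea, not EEVEE's actual shader code.
    """
    total = sum(weight for weight, _ in closures)
    r = rng.uniform(0.0, total)
    accumulated = 0.0
    for weight, bsdf in closures:
        accumulated += weight
        if r <= accumulated:
            return bsdf, total / weight
    # Floating point fallback: return the last closure.
    weight, bsdf = closures[-1]
    return bsdf, total / weight

# Usage: two closures with weights 0.7 and 0.3.
bsdf, factor = sample_closure([(0.7, "diffuse"), (0.3, "glossy")])
```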

And many more …

There is a huge list of features that are just waiting for this cleaner code base. That includes Grease Pencil support, vertex displacement, panoramic camera support, faster shader compilation, light linking, and more.

This is also an opportunity to rewrite the whole engine in C++. Hopefully this will help external contributions and encourage experimentation.

Conclusion

This endeavor has already started in the eevee-rewrite branch. Some features listed in this post are already implemented, but at this point it is too early for testing. Soon there will be experimental builds for developers to test. At that time the development team will create threads on devtalk.blender.org to follow feature-specific development.

The target plan for Blender 3.0 is to have all features back with the addition of some new features. The final set of feature targets for 3.0 is not yet decided.
