
Attribute-aware error metrics for simplification #158

Open
fstrugar opened this issue Jun 23, 2020 · 14 comments

@fstrugar

Hi! I'm playing with the https://developer.nvidia.com/orca/amazon-lumberyard-bistro dataset and meshoptimizer, and I've noticed this particular failure case related to the way the corners were authored.

For example, here is the original chair mesh:
[screenshot]

Notice the rounded corners with shared vertices. They survive the first pass of meshopt_simplify, down to half the number of triangles, just fine:
[screenshot]

However, once the triangles between two sides meeting at 90° get folded away and the sides start sharing vertices, the vertex normals can no longer be correct:
[screenshot]

What would be a solution to (automatically) preventing this?

I was thinking of adding a custom skip in the 'pickEdgeCollapses' loop when the angle between vertex normals is above a certain threshold, but I'm sure there's a better/simpler solution, perhaps already there? :)

(instead of preventing collapse, could also allow it but duplicate verts so normals aren't shared?)
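The angle-threshold skip I have in mind would be a small predicate along these lines; `shouldSkipCollapse` and the `normals` layout (unit normals as xyz triples) are hypothetical, not meshoptimizer code:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical helper: returns true if the collapse of edge (a, b) should be
// skipped because the per-vertex normals diverge beyond a threshold.
// 'normals' is assumed to hold unit normals as xyz triples.
bool shouldSkipCollapse(const float* normals, unsigned int a, unsigned int b,
                        float max_angle_radians)
{
    const float* na = normals + 3 * a;
    const float* nb = normals + 3 * b;

    // Cosine of the angle between the two unit normals.
    float cosine = na[0] * nb[0] + na[1] * nb[1] + na[2] * nb[2];

    // Skip the collapse when the angle exceeds the threshold.
    return cosine < std::cos(max_angle_radians);
}
```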

Thanks for the great library!!


zeux commented Jun 25, 2020

Yeah, so there are a few ways to fix this.

One is to discard and recompute normals post-simplification, possibly splitting vertices when the crease angle is too sharp. This works around the problem in a way, but of course it's not very convenient.
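The recompute-normals workaround can be sketched as a minimal area-weighted accumulation pass; `recomputeNormals` is an illustrative helper, not a meshoptimizer API, and the crease-angle vertex splitting is omitted:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Minimal sketch: recompute smooth vertex normals after simplification by
// accumulating face normals per vertex. Splitting vertices at sharp creases
// would be a separate pass and is omitted here.
void recomputeNormals(float* normals, const float* positions,
                      const unsigned int* indices, size_t index_count,
                      size_t vertex_count)
{
    for (size_t i = 0; i < vertex_count * 3; ++i)
        normals[i] = 0.f;

    for (size_t i = 0; i < index_count; i += 3)
    {
        const float* p0 = positions + 3 * indices[i + 0];
        const float* p1 = positions + 3 * indices[i + 1];
        const float* p2 = positions + 3 * indices[i + 2];

        float e1[3] = {p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]};
        float e2[3] = {p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]};

        // Cross product; its length is twice the triangle area, so summing
        // unnormalized face normals gives area weighting for free.
        float n[3] = {e1[1] * e2[2] - e1[2] * e2[1],
                      e1[2] * e2[0] - e1[0] * e2[2],
                      e1[0] * e2[1] - e1[1] * e2[0]};

        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                normals[3 * indices[i + j] + k] += n[k];
    }

    // Normalize the accumulated sums.
    for (size_t i = 0; i < vertex_count; ++i)
    {
        float* n = normals + 3 * i;
        float len = std::sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
        if (len > 0.f)
            for (int k = 0; k < 3; ++k)
                n[k] /= len;
    }
}
```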

Another one is to factor the normal delta into the simplification as an extra error. This can be done by comparing the normals alongside the edge that's considered for collapse, or it can be done by introducing the normal into the quadric weight. It's on my list to experiment more with this; there's a simplify-attr branch in this repository from my last attempt, but when I worked on it at the time it became clear that this isn't very simple, so I decided to take a break and think about it more.

This isn't implemented right now, though. It's definitely good to address, but I'm not sure what the best solution is: ideally it's not just the normals that need to be taken into account, and balancing ease of use, performance, and quality here is tricky...


fire commented Dec 24, 2020

I also encountered this problem from the mesh in #206 (comment):

[screenshot]


fire commented Dec 29, 2020

@zeux Would you be able to look at this?

Thanks for your amazing work on meshoptimizer.


zeux commented Dec 29, 2020

I believe this part of the comment above accurately reflects the plan here:

Another one is to factor the normal delta into the simplification as an extra error. This can be done by comparing the normals alongside the edge that's considered for collapse, or it can be done by introducing the normal into the quadric weight. It's on my list to experiment more with this; there's a simplify-attr branch in this repository from my last attempt, but when I worked on it at the time it became clear that this isn't very simple, so I decided to take a break and think about it more.

Since this issue is still open, you can assume I'm going to look into this at some point in the future; when exactly that will be, I can't say, as this requires some further research on how best to integrate the attribute metrics with the geometry metrics in a way that is reasonably easy to tune once, instead of having to tweak weights per model.

@zeux changed the title from "Question on normals and/or custom error metrics" to "Attribute-aware error metrics for simplification" on Dec 30, 2020

fire commented Apr 8, 2021

Factor the normal-delta into the simplification as an extra error by comparing the normals alongside the edge considered for collapse.

The metric can also be computed by introducing the normal into the quadric weight.

Since comparing attributes is not simple, would there be any other approaches?

I wanted to look into this, but I'm a bit lost.

Edited:

I tried using your attribute branch and didn't see any major problems.

https://github.com/fire/meshoptimizer/tree/simplify-normal-attribute


zeux commented Apr 9, 2021

I tried using your attribute branch and didn't see any major problems.

Yeah, it needs more work to be production-ready with respect to the metric; I think the branch predates some geometric improvements, and it also needs some interface and optimization work. FWIW I plan to resume this in the next few weeks.


fire commented Apr 9, 2021

Is there a better way to define normals being "close enough"? It seems to block optimization of any curved surface. Only flat planes get optimized.

Not sure how to allow the first pass of decimation in the chair example and then block the ones that fail.

I wish there was a way to optimize the indices with the normals on the second try.

My thought is to use quad remeshing or isotropic remeshing; that takes a lot of work, but it gives the optimizer more room to operate.



zeux commented Apr 9, 2021

It seems to block optimizations of any curved surface.

That's because the metric needs work, I believe; the code in that branch right now is very challenging to tune properly, which is part of why this hasn't been integrated yet. I'm not aware of existing research that's more promising than the general approach used there, but since that code isn't production-ready it can have all sorts of issues, and it likely requires taking a path that hasn't been explicitly documented in academia (at least that was the case for geometric error, where the approach meshoptimizer uses is inspired by prior research but doesn't follow it precisely).

Remeshing is orthogonal to simplification: it can definitely make topology-aware simplification easier, but it doesn't solve the problem by itself, and you still need attribute awareness within the simplifier to address the significant attribute distortion shown in this thread.


fire commented Apr 9, 2021

I'll do some literature searches for vertex normal merge, collapse, and flip metrics.

If you have any keywords I can search that'll help too.

Edited:

I'll list some promising papers:

https://dl.acm.org/doi/pdf/10.1145/2425836.2425911

Edited:

I'm going to use a 6-element truncated 3x3 orientation matrix to store the normal. This uses 6 attributes. It seems to work OK.
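One possible reading of this 6-element encoding: store the first two columns of an orthonormal frame whose third column is the unit normal, and recover the normal as their cross product. The helpers below are an illustrative sketch under that assumption, not the code from the linked branch:

```cpp
#include <cassert>
#include <cmath>

// Pack a unit normal as two orthonormal tangent columns (6 floats).
// Illustrative encoding only; the linked branch may encode differently.
void packNormal(const float n[3], float out[6])
{
    // Pick a reference axis that is not parallel to n.
    float ref[3] = {1.f, 0.f, 0.f};
    if (std::fabs(n[0]) > 0.9f)
    {
        ref[0] = 0.f;
        ref[1] = 1.f;
    }

    // First tangent: normalize(ref x n), perpendicular to n by construction.
    float t[3] = {ref[1] * n[2] - ref[2] * n[1],
                  ref[2] * n[0] - ref[0] * n[2],
                  ref[0] * n[1] - ref[1] * n[0]};
    float len = std::sqrt(t[0] * t[0] + t[1] * t[1] + t[2] * t[2]);
    for (int i = 0; i < 3; ++i)
        out[i] = t[i] / len;

    // Second tangent: n x t, already unit length for unit inputs.
    out[3] = n[1] * out[2] - n[2] * out[1];
    out[4] = n[2] * out[0] - n[0] * out[2];
    out[5] = n[0] * out[1] - n[1] * out[0];
}

// Recover the normal as the cross product of the two stored columns.
void unpackNormal(const float in[6], float n[3])
{
    n[0] = in[1] * in[5] - in[2] * in[4];
    n[1] = in[2] * in[3] - in[0] * in[5];
    n[2] = in[0] * in[4] - in[1] * in[3];
}
```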


fire commented May 22, 2021

godotengine/godot#47764

@zeux

Can you take a moment to see if this is legitimate? The Godot Engine contributors had concerns about applying unmerged patches on top of meshoptimizer.

I wanted some motion on this topic.

Thanks!


Zylann commented Jul 4, 2021

Considering the title of this issue:

I have voxel meshes which can contain encoded texture splatting parameters in extra attributes (repurposing color and UV) in additional vertex arrays.

Problem: removing vertices in that scenario directly reduces quality even if geometry is preserved. Simplification only seems to care about vertex positions, which means there should be more information to give meshoptimizer, or some way to customize the comparison between vertices.
Tangent problem: my meshes use multiple streams (structure of arrays), but the current API seems to only take one.

I'm wondering if simplification is actually suited to that situation; otherwise it doesn't actually sound... simple (the kind of data I'm storing is packed sets of indices and weights).

Does this match the current issue or should I open another?


zeux commented Jul 4, 2021

Yes, that's the same problem as highlighted in this issue. Attribute-aware simplification will be exposed as a separate function with separate attribute stream inputs.
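For the structure-of-arrays case above, the separate streams would need to be flattened into one interleaved attribute buffer before being handed to such a function. A minimal sketch with a hypothetical layout (RGBA color followed by UV); the helper is not a meshoptimizer API:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch: interleave structure-of-arrays vertex streams (here RGBA color
// and UV) into one float buffer of 6 floats per vertex, the shape an
// attribute-aware simplifier entry point could consume alongside
// per-attribute weights. Layout and function name are illustrative.
std::vector<float> interleaveAttributes(const float* colors, const float* uvs,
                                        size_t vertex_count)
{
    const size_t kColorComponents = 4;
    const size_t kUVComponents = 2;
    std::vector<float> result(vertex_count * (kColorComponents + kUVComponents));

    for (size_t i = 0; i < vertex_count; ++i)
    {
        float* dst = &result[i * (kColorComponents + kUVComponents)];
        for (size_t c = 0; c < kColorComponents; ++c)
            dst[c] = colors[i * kColorComponents + c];
        for (size_t c = 0; c < kUVComponents; ++c)
            dst[kColorComponents + c] = uvs[i * kUVComponents + c];
    }
    return result;
}
```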

@Adi-Amazon

At the risk of stating the obvious, my suggestion would be to:

  1. Have attribute-aware simplification, as suggested.
  2. On top of this, expose user-defined thresholds for each of the control parameters: for example, normal crease angle, minimal distance for position welding, color and UV differentiation, etc.
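To make that tuning surface concrete, the suggested thresholds could be gathered into an options struct; every name and default below is hypothetical, nothing like this exists in meshoptimizer:

```cpp
#include <cassert>

// Hypothetical knobs along the lines of the suggestion above. The names
// and defaults are illustrative only, not a real meshoptimizer interface.
struct SimplifyAttributeOptions
{
    float normal_crease_angle_degrees = 45.f; // split/skip collapses above this angle
    float position_weld_distance = 1e-4f;     // vertices closer than this may weld
    float color_tolerance = 1.f / 255.f;      // max per-channel color delta
    float uv_tolerance = 1e-3f;               // max UV delta before penalizing
};
```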


zeux commented Nov 2, 2023

Copying the comment from #524 on some future work involved here; the issue will stay open as the algorithm improves further:

  • The attribute metric is not perfect - it's functioning correctly and is numerically stable, but it misses certain obvious visual errors. I have some ideas on how to improve this but it requires significant math modeling work.
  • The attribute quadrics are not properly aggregated across discontinuities. This is the case for Godot's fork as well.
  • The resulting error, as well as error limit, include the attribute error. Godot's fork adjusts output error to only track distance, but keeps error limit as is. I might instead add a second output error parameter, we'll see. This also requires tracking both errors, which increases the collapse list structure if done naively.
  • The attribute and geometry errors are hard to balance. There are some ideas I'd like to try around this, but right now very careful weight tuning is required for good results, and the weights strongly depend on the type of attribute involved.
  • In presence of attributes, some automatic optimizations like vertex welding are possible that would significantly improve the quality for some topology-constrained meshes.
