Sloppy Tolerances

blobfish

CAD community veteran
Been looking into working around an offset bug. As the bug report describes, some offset routines just flat out assign 0.001 as a tolerance without any calculation. Upon investigation, it appears that most, if not all, setting of tolerance values is predicated upon the new tolerance being greater/looser than the current tolerance. That makes sense, as the greatest calculated tolerance should be the final tolerance. However, that practice makes it hard to 'tighten' tolerances. Here is what I think is the minimum workflow to tighten tolerances.

C++:
#include <cassert>
#include <iomanip>
#include <iostream>
#include <limits>
#include <stdexcept>

#include <BRep_Builder.hxx>
#include <BRep_Tool.hxx>
#include <BRepTools.hxx>
#include <Precision.hxx>
#include <ShapeFix.hxx>
#include <ShapeFix_ShapeTolerance.hxx>
#include <Standard_Failure.hxx>
#include <TopAbs_ShapeEnum.hxx>
#include <TopoDS_Shape.hxx>

// readShape was a local helper in the original post; a minimal
// stand-in wrapping BRepTools::Read looks like this:
TopoDS_Shape readShape(const char *fileName)
{
  TopoDS_Shape out;
  BRep_Builder builder;
  BRepTools::Read(out, fileName, builder); // leaves 'out' null on failure
  return out;
}

int main (int /*argc*/, char** /*argv*/)
{
  try
  {
    // Report the loosest tolerance per sub-shape type.
    auto report = [](const TopoDS_Shape &s)
    {
      std::cout << std::setprecision(std::numeric_limits<double>::digits10 + 1) << std::fixed
      << "  Max Face Tolerance:  " << BRep_Tool::MaxTolerance(s, TopAbs_FACE) << std::endl
      << "  Max Edge Tolerance:  " << BRep_Tool::MaxTolerance(s, TopAbs_EDGE) << std::endl
      << "Max Vertex Tolerance:  " << BRep_Tool::MaxTolerance(s, TopAbs_VERTEX) << std::endl
      << std::endl;
    };

    auto inShape = readShape("../../offsetTolerance3.brep");
    assert(!inShape.IsNull());
    std::cout << std::endl << "Input Shape:" << std::endl;
    report(inShape);

    // Force all edge and vertex tolerances down to Precision::Confusion().
    ShapeFix_ShapeTolerance tighten;
    tighten.SetTolerance(inShape, Precision::Confusion(), TopAbs_WIRE);
    std::cout << std::endl << "Tighten Shape:" << std::endl;
    report(inShape);

    // Let SameParameter relax the tolerances back up to meaningful values.
    std::cout << std::endl << "Output Shape:" << std::endl;
    if (!ShapeFix::SameParameter(inShape, false))
      throw std::runtime_error("Shape fix failed");
    report(inShape);

    std::cout << std::endl << "Program finished normally" << std::endl;
  }
  catch (const Standard_Failure &error)
  {
    std::cout << "OCC Error: " << error.GetMessageString() << std::endl;
  }
  catch (const std::exception &error)
  {
    std::cout << "My Error: " << error.what() << std::endl;
  }

  return 0;
}

Here is the output of that code:
Code:
Input Shape:
  Max Face Tolerance:  0.0000001000000000
  Max Edge Tolerance:  0.0010000000000000
Max Vertex Tolerance:  0.0010000000000000


Tighten Shape:
  Max Face Tolerance:  0.0000001000000000
  Max Edge Tolerance:  0.0000001000000000
Max Vertex Tolerance:  0.0000001000000000


Output Shape:
  Max Face Tolerance:  0.0000001000000000
  Max Edge Tolerance:  0.0000001344348549
Max Vertex Tolerance:  0.0000001344348549

Opinions? Thoughts?
 

Quaoar

Administrator
Staff member
Tolerances are a huge problem, indeed. My approach to tightening them was to remove the 3D curves of edges and reconstruct them from the p-curves, if any are available. That said, this approach was never good enough, and I never really relied on it.

Code:
// Assumed headers: asiAlgo_Utils.h is the Analysis Situs own header;
// the rest are OCCT.
#include <asiAlgo_Utils.h>

#include <BRep_GCurve.hxx>
#include <BRep_ListOfCurveRepresentation.hxx>
#include <BRep_TEdge.hxx>
#include <ShapeBuild_Edge.hxx>
#include <ShapeFix_Edge.hxx>
#include <TopExp_Explorer.hxx>
#include <TopoDS.hxx>
#include <TopoDS_Edge.hxx>
#include <TopoDS_Shape.hxx>

void asiAlgo_Utils::RebuildBounds(TopoDS_Shape& shape)
{
  for ( TopExp_Explorer exp(shape, TopAbs_EDGE); exp.More(); exp.Next() )
  {
    const TopoDS_Edge& E = TopoDS::Edge( exp.Current() );
    const BRep_TEdge* TE = static_cast<const BRep_TEdge*>( E.TShape().get() );
    const BRep_ListOfCurveRepresentation& listOfCurves = TE->Curves();

    // Check if there is at least one p-curve. If not, we cannot
    // reconstruct the 3D curve.
    bool hasAnyPCurves = false;
    for ( BRep_ListIteratorOfListOfCurveRepresentation cit(listOfCurves); cit.More(); cit.Next() )
    {
      const Handle(BRep_GCurve)&
        fromGC = Handle(BRep_GCurve)::DownCast( cit.Value() );
      //
      if ( fromGC.IsNull() ) continue;
      if ( fromGC->IsCurveOnSurface() )
      {
        hasAnyPCurves = true;
        break;
      }
    }

    if ( hasAnyPCurves )
    {
      // Rebuild the 3D representation from the p-curve.
      ShapeBuild_Edge().RemoveCurve3d(E);
      ShapeFix_Edge().FixAddCurve3d(E);
    }
  }
}

Your approach looks a bit more elegant. Correct me if I am wrong: you enforce the tolerance to be perfectly small and then run the same-parameter fix to snap it to a meaningful value, right? I think this should avoid approximation errors at least.

And yes, offsets are not good.
 

blobfish

CAD community veteran
Your approach looks a bit more elegant. Correct me if I am wrong: you enforce the tolerance to be perfectly small and then run the same-parameter fix to snap it to a meaningful value, right? I think this should avoid approximation errors at least.
Yeah the "tighten.SetTolerance(inShape, Precision::Confusion(), TopAbs_WIRE);" call sets all edge and vertex tolerances to Precision::Confusion. Then the "ShapeFix::SameParameter(inShape, false))" loosens the tolerances to acceptable values. That is the idea anyway. It appears to be working so far. It doesn't appear we need to enforce(second argument to SameParameter call), but would welcome other opinions on that.


Tolerances are a huge problem, indeed. My approach to tightening them was to remove the 3D curves of edges and reconstruct them from the p-curves, if any are available. That said, this approach was never good enough, and I never really relied on it.
What was happening with your method that made you give up on it? It looks like it should work.
 

Quaoar

Administrator
Staff member
What was happening with your method that made you give up on it? It looks like it should work.
Well, I did not give up on it, and it was integrated into one of the projects I did for OCC. What I actually meant to say is that it did not work like a charm and did not fix many problems. E.g., among 2k cases, there were some random shapes where it managed to tighten tolerances a bit. But here I have to clarify the workflow:

1. Import IGES.
2. Run sewing with a default but big enough tolerance (e.g., 1 mm for a 300 m long ship hull).
3. Perform hand-made Boolean cut.
4. Make offset solid.

In this workflow, sewing was doing crap somewhat similar to what you discovered in offsets. But I wouldn't blame it, as sewing was initially run on quite dirty geometry. Then, at the end of the workflow, I tried to improve the tolerances, because the next stage was offsetting the base plate, and offsets amplified the inaccuracies, making them only worse. As the ultimate goal was to mesh this whole thing, and good meshers can tolerate some geometric flaws (and we had a good mesher by Distene), we could survive high tolerances at the end. But I suspect our final models were not clean enough to undergo any further modeling.
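For reference, step 2 above amounts to something like the following sketch; the function name is illustrative, and the 1 mm value is the one mentioned for the hull case:

Code:
#include <BRepBuilderAPI_Sewing.hxx>
#include <TopoDS_Shape.hxx>

// Sew imported IGES faces with a deliberately large working tolerance
// (1 mm, in model units, for the 300 m hull mentioned above).
TopoDS_Shape sewFaces(const TopoDS_Shape &rawFaces)
{
  BRepBuilderAPI_Sewing sewing(1.0); // working tolerance
  sewing.Add(rawFaces);
  sewing.Perform();
  return sewing.SewedShape();
}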

These tolerance issues are typical for OpenCascade and are extremely hard to resolve. A single modeling operator can ruin your shape invisibly and infect the entire subsequent modeling workflow with incurable issues. Tricks like tolerance tightening can be used to some extent, but I did not see much improvement from adopting them; they were mostly a "last chance" desperation move to me :D
 

blobfish

CAD community veteran
Well, I did not give up on it, and it was integrated into one of the projects I did for OCC. What I actually meant to say is that it did not work like a charm and did not fix many problems. E.g., among 2k cases, there were some random shapes where it managed to tighten tolerances a bit. But here I have to clarify the workflow:

1. Import IGES.
2. Run sewing with a default but big enough tolerance (e.g., 1 mm for a 300 m long ship hull).
3. Perform hand-made Boolean cut.
4. Make offset solid.

In this workflow, sewing was doing crap somewhat similar to what you discovered in offsets. But I wouldn't blame it, as sewing was initially run on quite dirty geometry. Then, at the end of the workflow, I tried to improve the tolerances, because the next stage was offsetting the base plate, and offsets amplified the inaccuracies, making them only worse. As the ultimate goal was to mesh this whole thing, and good meshers can tolerate some geometric flaws (and we had a good mesher by Distene), we could survive high tolerances at the end. But I suspect our final models were not clean enough to undergo any further modeling.
I see. Yes, IGES is a mess. The only way I was able to fix IGES was to untrim all faces and 're-trim' them to each other. That was difficult to work through using a GUI. I never tried to write code to automate it. I am a big believer in remodeling everything from scratch. Of course, that is not always feasible.

These tolerance issues are typical for OpenCascade and are extremely hard to resolve. A single modeling operator can ruin your shape invisibly and infect the entire subsequent modeling workflow with incurable issues. Tricks like tolerance tightening can be used to some extent, but I did not see much improvement from adopting them; they were mostly a "last chance" desperation move to me :D
This sounds like gospel to me. (y)
 