
V-Guide™ 

A complete 3D robot guidance solution

Proprietary 3D sensors, proprietary software, and industrial hardware

Robot-mounted sensors see and adjust to the unique pose of each part, rack, stack, pallet, etc.



Automate tasks that are impossible for unguided robots to perform


Racking / De-Racking
The shape and position of racks are inconsistent. Parts get loaded incorrectly or shift during transport.

 


Seam Sealing Doors
Variations in door-to-door position & orientation are inevitable, and fixturing doors on a line isn't possible. 



Precision Pick & Transfer
Pick the part with precision to place it with precision. Eliminate extra steps & centering fixtures.


 

Fast
From image to robot adjustment in under a second


Conventional robot-mounted vision guidance systems typically require 3 or more measurements to infer a part offset in 6 degrees of freedom. 

With V-Guide™, volumetric sensors use one image to map a part’s 3D surfaces, compare them to a reference, and provide an offset the robot uses to adjust - all in less than a second. 
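That flow can be illustrated with open-source tools. The sketch below uses the Open3D library and synthetic data as a stand-in for Liberty Reach's proprietary sensors and surface-matching algorithms; every name and number in it is an assumption made for illustration, not the V-Guide™ implementation.

```python
# Illustrative sketch only: Open3D and the synthetic data below stand in for
# Liberty Reach's proprietary sensors and matching algorithms.
import copy

import numpy as np
import open3d as o3d

# Reference scan: the part surface in its taught (nominal) position.
rng = np.random.default_rng(0)
reference = o3d.geometry.PointCloud()
reference.points = o3d.utility.Vector3dVector(rng.uniform(-0.5, 0.5, size=(2000, 3)))  # metres

# Simulate the current part: same surface, shifted 15 / -10 / 5 mm and rotated 2 degrees.
true_offset = np.eye(4)
true_offset[:3, :3] = o3d.geometry.get_rotation_matrix_from_xyz((0.0, 0.0, np.deg2rad(2.0)))
true_offset[:3, 3] = [0.015, -0.010, 0.005]
measured = copy.deepcopy(reference)
measured.transform(true_offset)

# Surface matching: align the measured cloud back onto the reference.  The inverse of the
# alignment transform is the part offset a robot would apply to its taught path.
result = o3d.pipelines.registration.registration_icp(
    measured, reference, max_correspondence_distance=0.1,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
part_offset = np.linalg.inv(result.transformation)
print("Recovered part offset (4x4 homogeneous transform):")
print(np.round(part_offset, 4))
```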

The result is clear: 2 fewer images + 2 fewer robot movements = 8 seconds of cycle time savings. 
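The 8-second figure can be broken down in more than one way; the sketch below assumes, purely for illustration, that each eliminated image and each eliminated robot move costs about 2 seconds. Actual times depend on the sensor, the robot, and the cell.

```python
# Purely illustrative breakdown: the 2-second per-step figures are assumptions, not measured data.
images_eliminated = 2      # 3+ measurements reduced to a single image
moves_eliminated = 2       # robot repositions between measurements eliminated
seconds_per_image = 2.0    # assumed acquisition + processing time per extra image
seconds_per_move = 2.0     # assumed time for the robot to move the sensor between views

savings = images_eliminated * seconds_per_image + moves_eliminated * seconds_per_move
print(f"Cycle time saved per part: {savings:.0f} s")   # 8 s, matching the figure above
```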

With cycle time savings of this magnitude, V-Guide™ pays for itself in a matter of days.


 

Versatile
V-Guide™ outperforms the competition

No external lighting required.
Immune to factory lighting.

Other systems fault with any lighting changes

One system for many part styles.
Add / modify styles with ease.

Other systems require additional sensors & lights

Calculates offsets with large part-to-part shifts & rotations.

Other systems fault with 10 mm shifts or 1-2° rotations.

Simple setup. Simple operation.
Easy to learn and use.

Other systems require frequent paid support from supplier

 

Innovative
An original approach for superior 3D robot guidance

V-Guide™ combines proprietary 3D sensors & advanced surface matching algorithms developed by Liberty Reach to create a fast, versatile, and innovative robot guidance system that is unmatched for many applications.

 


V-Guide™ uses one or two compact sensors mounted to the robot's end-of-arm tooling.

 

V-Guide™ - How it works

Please request a demo to view videos and live demonstrations showing how V-Guide works in detail.

We don't want to share our innovations with the whole world, but we're happy to show & tell in private.

 

 

Additional V-Guide™ benefit - quality check

V-Guide™ will generate an alert if the shape of the sample part is significantly different from the reference.  When this occurs, the part is removed from the line for additional inspection before V-Guide™ is reset.  This alert prevents a bad part from making it further downstream, and in many instances prevents the tooling tip from colliding with the part.

V-Guide™ will also generate an alert when a part has shifted or rotated too much. An excessive offset could cause the robot to collide with a nearby object, such as the part rack. The out-of-bounds alert prevents this collision and gives operators the chance to re-position the out-of-tolerance part before resetting V-Guide™.
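Both alerts amount to threshold checks on the surface-match quality and on the size of the computed offset. The sketch below is a hypothetical illustration of that logic; the thresholds, function name, and units are assumptions, not V-Guide™ settings.

```python
# Hypothetical alert logic; thresholds, names, and units are assumptions, not V-Guide settings.
import numpy as np

MAX_FIT_RMSE_MM = 2.0      # worse surface fit than this -> shape/quality alert
MAX_SHIFT_MM = 50.0        # larger translation than this -> out-of-bounds alert
MAX_ROTATION_DEG = 5.0     # larger rotation than this -> out-of-bounds alert

def check_part(offset: np.ndarray, fit_rmse_mm: float) -> list[str]:
    """Return alerts for one measured part.

    offset      -- 4x4 transform from the reference pose to the measured pose (metres)
    fit_rmse_mm -- residual surface-matching error after alignment, in millimetres
    """
    alerts = []
    if fit_rmse_mm > MAX_FIT_RMSE_MM:
        alerts.append("QUALITY: part shape deviates from reference - remove for inspection")
    shift_mm = np.linalg.norm(offset[:3, 3]) * 1000.0
    # Rotation angle of the 3x3 rotation block, via trace(R) = 1 + 2*cos(angle).
    cos_angle = np.clip((np.trace(offset[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rotation_deg = np.degrees(np.arccos(cos_angle))
    if shift_mm > MAX_SHIFT_MM or rotation_deg > MAX_ROTATION_DEG:
        alerts.append("OUT OF BOUNDS: excessive offset - re-position the part before resetting")
    return alerts

print(check_part(np.eye(4), fit_rmse_mm=0.4))   # nominal part, no alerts -> []
```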

 

 

The outdated way to do 3D robot guidance
Using multiple 2D cameras and lights to infer 6 DoF offsets

Feature-based offsets - how it works

  • 2D cameras & lights capture images of three or more features (holes, edges, etc.) on a part in a known position
  • The same features are imaged on each subsequent part using the same camera positions & lights
  • Software measures the change in the 2D images to infer each object's shift and rotation relative to the known position in 6 degrees of freedom (see the sketch after this list)
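For contrast with the single-image volumetric approach, the feature-based method outlined above can be sketched with OpenCV's perspective-n-point solver: given the 3D positions of a few features on the nominal part and the 2D pixel locations where they appear in an image, solvePnP infers the part's pose relative to the camera. All coordinates and camera parameters below are made-up values for illustration, not any specific vendor's system.

```python
# Illustrative sketch of feature-based pose inference; coordinates and camera
# parameters are made up, and this is not any specific vendor's implementation.
import numpy as np
import cv2

# 3D positions of four features (holes/edges) on the part, in metres, in the part frame.
object_points = np.array([
    [0.00, 0.00, 0.0],
    [0.40, 0.00, 0.0],
    [0.40, 0.25, 0.0],
    [0.00, 0.25, 0.0],
], dtype=np.float64)

# 2D pixel locations where those features were found in the current image (assumed values).
image_points = np.array([
    [652.1, 498.7],
    [1184.3, 502.2],
    [1180.8, 831.5],
    [648.4, 827.9],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (fx, fy, cx, cy) with no lens distortion.
camera_matrix = np.array([[1400.0, 0.0, 960.0],
                          [0.0, 1400.0, 540.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# Solve for the part pose relative to the camera; comparing against the pose measured
# on the reference part yields the 6 DoF offset used for guidance.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
print("Rotation vector (rad):", rvec.ravel())
print("Translation (m):", tvec.ravel())
```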


Downsides of feature-based guidance

  • Highly dependent on lighting: each camera has its own dedicated light source. Flashing lights in the factory will disrupt the measurement and cause a fault
  • Extra hardware: a system requires anywhere from one to potentially dozens of 2D camera and light-source sets
  • Inflexible: adding a new style to the cell requires an entirely new set of cameras and lights aimed at the new style's holes / features
  • Slow: when mounted on a robot, the cameras must move to at least 3 positions - this typically adds 5-10 seconds of cycle time
  • Difficult: feature-based guidance systems are notoriously difficult to set up, calibrate, and troubleshoot. When problems occur, or the system needs to be modified, you'll have to rely on the supplier to come on-site to help - costing time and money that you don't have
  • Narrow field of view: part shifts of 10-15mm or rotations of 2-3° move the feature out of the field of view, causing a system fault
  • No usable features: a part without holes typically can't be measured

V-Guide™ volumetric sensors take a single image to define an object's surfaces as a 3D point cloud.

A 3D point cloud defines the shape, location, and pose of the object in 6 degrees of freedom.
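That 6 degree-of-freedom pose is simply a rigid transform: three translations plus three rotations. Below is a minimal sketch of pulling those six values out of a 4x4 transform (such as the one recovered in the earlier registration sketch), using SciPy as an illustration rather than the V-Guide™ implementation.

```python
# Decompose a 4x4 rigid transform into 6 degrees of freedom: X/Y/Z translation
# plus roll/pitch/yaw rotation.  The example transform below is arbitrary.
import numpy as np
from scipy.spatial.transform import Rotation

def to_six_dof(offset: np.ndarray) -> dict[str, float]:
    """Translation in millimetres and intrinsic XYZ (roll/pitch/yaw) angles in degrees."""
    tx, ty, tz = offset[:3, 3] * 1000.0
    roll, pitch, yaw = Rotation.from_matrix(offset[:3, :3]).as_euler("XYZ", degrees=True)
    return {"x_mm": tx, "y_mm": ty, "z_mm": tz,
            "roll_deg": roll, "pitch_deg": pitch, "yaw_deg": yaw}

# Example: a part shifted 20 mm in X and 5 mm in Z, and rotated 3 degrees about Z.
example = np.eye(4)
example[:3, :3] = Rotation.from_euler("XYZ", [0.0, 0.0, 3.0], degrees=True).as_matrix()
example[:3, 3] = [0.020, 0.000, 0.005]
print(to_six_dof(example))
```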