Open source C++ library
Recognition and Vision Library
The Recognition and Vision Library (RAVL) provides a base C++ class library together with a range of computer vision, pattern recognition, audio and supporting tools. It is provided under the GNU Lesser General Public License (LGPL) and is hosted on SourceForge.
RAVL was originally developed at the Centre. It was subsequently released as open source to support its use by a wider community.
Find out more about RAVL (PDF)
Features
Some of the features that set RAVL apart from other C++ libraries are:
- SMP/thread-safe reference counting, allowing easy construction of large programs that take full advantage of multiprocessor servers
- Powerful I/O mechanism, allowing file format and type conversion issues to be handled transparently, separately from the main code
- Java-like class interfaces which largely avoid the direct use of pointers, allowing code to be written in a clear, readable style (see the sketch after this list)
- Easy-to-use and powerful make system suitable for building both large and small projects.
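To make the first and third features concrete, the sketch below shows the general handle/body idiom with atomic reference counting. The class names (`ImageC`, `ImageBodyC`) and members are illustrative only and are not RAVL's actual API; they simply show how copying a small handle object shares a reference-counted body, so user code never manipulates raw pointers directly.

```cpp
#include <atomic>
#include <iostream>

// Body: holds the data and an atomic reference count shared by all handles.
class ImageBodyC {
public:
  ImageBodyC(int width, int height)
    : refCount(1), width(width), height(height) {}

  void IncRef() { refCount.fetch_add(1, std::memory_order_relaxed); }
  void DecRef() {
    // Delete the body when the last handle releases it.
    if (refCount.fetch_sub(1, std::memory_order_acq_rel) == 1)
      delete this;
  }

  int Width() const { return width; }
  int Height() const { return height; }

private:
  std::atomic<int> refCount;
  int width, height;
};

// Handle: a small, copyable object used like a Java reference.
// Copying the handle shares the body; only the reference count changes.
class ImageC {
public:
  ImageC(int width, int height) : body(new ImageBodyC(width, height)) {}
  ImageC(const ImageC &other) : body(other.body) { body->IncRef(); }
  ImageC &operator=(const ImageC &other) {
    if (this != &other) {
      other.body->IncRef();
      body->DecRef();
      body = other.body;
    }
    return *this;
  }
  ~ImageC() { body->DecRef(); }

  int Width() const { return body->Width(); }
  int Height() const { return body->Height(); }

private:
  ImageBodyC *body;
};

int main() {
  ImageC a(640, 480);
  ImageC b = a;  // b now shares a's body
  std::cout << b.Width() << "x" << b.Height() << std::endl;
  return 0;
}
```

Because the count is updated atomically, handles can be copied and destroyed from multiple threads without additional locking, which is the property the SMP/thread-safe feature refers to.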
Supported platforms
RAVL is written in ANSI C++ and is intended to work on a wide range of platforms and compilers. Currently it is actively maintained under:
| Operating system | Processor | Compiler |
|---|---|---|
| Linux | i386 | GNU gcc v. 4.4.3 |
| Windows | i386 | Visual Studio 2005 |
In the past it was also maintained under these platforms:
| Operating system | Processor | Compiler |
|---|---|---|
| Solaris | Sparc | GNU gcc v. 3.3 |
| IRIX | Mips | MIPS Pro |
Users
RAVL is used by a small number of organisations.
Frequently asked questions
Please see the RAVL FAQs if you have any questions about this resource.
Acknowledgements
RAVL was originally derived from AMMA, written by Radek Marik with help from many other members of CVSSP. The work of porting AMMA to RAVL was largely undertaken by Charles Galambos, again with help from other members of CVSSP. The RavlMath library includes ccmath, written by Daniel A. Atkinson. RAVL is currently maintained by members of CVSSP and Omniperception.