Bindings: Use swig_add_library instead of swig_add_module
This is now in the YCM UseSWIG version.
Showing 1 changed file with 33 additions and 11 deletions.
Hi @drdanz,
I get the following error: Unknown CMake command "swig_add_library". Where does swig_add_library come from?
Best, Tobi
@Tobias-Fischer it is in UseSWIG.cmake in cmake/ycm-0.2/cmake-next/. If it is not used, there might be a problem. I need some information to debug this.
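A rough sketch of how that kind of information could be gathered from a separate bindings build directory; the relative paths are assumptions about a standard YARP checkout, not commands taken from the thread:

```sh
# Run from the separate bindings build directory (e.g. yarp/bindings/build_python).
# Check whether the CMake-next UseSWIG from YCM is enabled, and where
# swig_add_library would be defined inside the YARP source tree.
grep YCM_USE_CMAKE_NEXT CMakeCache.txt
grep -ril "swig_add_library" ../../cmake/ycm-0.2/cmake-next/
cmake --version   # CMake's own UseSWIG only gained swig_add_library in CMake 3.8
```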
Hi,
Thanks! Here is the debug info: YCM_USE_CMAKE_NEXT is ON, and I am building in ./yarp/bindings/build_python.
Mhh, I wonder how the external build is working after ba16979 ... ?_?
If you don't have any use cases for which you strongly need that behaviour, I think we should just remove the "separate build" support for building bindings, also from the documentation, see #318.
@traversaro To be honest I haven't tested it yet, and it is not tested by any unit test... I was just reminded of it when I saw @Tobias-Fischer's comment.
I agree that we should remove the "separate build" support, but I think it is still there to support binary releases that come without the bindings.
@Tobias-Fischer Can you try re-building yarp and enabling the bindings there, instead of using a separate build?
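A minimal sketch of what such an in-tree build might look like; the option names YARP_COMPILE_BINDINGS and CREATE_PYTHON are assumptions about this YARP version and are not taken from the thread:

```sh
# Configure YARP itself with the Python bindings enabled (option names are assumptions),
# then build and install as usual instead of configuring yarp/bindings separately.
cd yarp/build
cmake .. -DYARP_COMPILE_BINDINGS=ON -DCREATE_PYTHON=ON
make -j4
sudo make install
```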
Hi @traversaro @drdanz,
Thanks for the hint. Yes, it compiles properly if I do it directly within yarp/build. However, when I try to import yarp, it fails, although it is installed. However, I don't have /usr/local/src/robot/install/yarp/lib/x86_64-linux-gnu/ in my LD_LIBRARY_PATH. Do I need to add it now? On another machine, which has the path in its LD_LIBRARY_PATH, it all works fine.
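A quick way to check whether the missing library path is the cause, reusing the install prefix quoted above (whether PYTHONPATH also needs to be set depends on where the module was installed and is not covered here):

```sh
# Point the dynamic loader at the installed YARP libraries and retry the import;
# the path comes from the comment above.
export LD_LIBRARY_PATH=/usr/local/src/robot/install/yarp/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH
python -c "import yarp; print(yarp.__file__)"
```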
Why is it still in x86_64-linux-gnu? That should no longer happen... Are you sure that they are not left over by some older installation?
Anyway: which _yarp.so version are you using, the one in the build directory or the one installed? Did you set CMAKE_SKIP_INSTALL_RPATH and/or CMAKE_SKIP_RPATH in CMake? What is the output of objdump -x <lib/python/_yarp.so> | grep RPATH?
Actually I just noticed that the RPATH is fixed only for lua and perl, so yes, I think you have to add the path to LD_LIBRARY_PATH, even though I don't think the path is supposed to have x86_64-linux-gnu in it...
Hi,
I'm using the installed _yarp.so. Both CMAKE_SKIP_INSTALL_RPATH and CMAKE_SKIP_RPATH are OFF.
Output of objdump: RPATH $ORIGIN/:$ORIGIN/../lib/x86_64-linux-gnu:$ORIGIN/../:/usr/local/cuda/lib64
Ahhh I see. Shall I open an issue to fix the RPATH for the other bindings as well?
Ok, now I definitely believe there's an issue with this x86_64-linux-gnu. Is this a very old build or a new one? Can you try to remove the CMAKE_INSTALL_LIBDIR variable from ccmake (select it and hit 'd') and see how it is re-created? You should currently see lib/x86_64-linux-gnu; if it is working correctly, it should become just lib.
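The same check can be done from the command line instead of ccmake; a sketch, assuming the variable is recomputed by the install-dirs logic at configure time:

```sh
# Drop the cached value and re-run the configure step so it gets recomputed,
# then verify what it was re-created as (expect "lib", not "lib/x86_64-linux-gnu").
cd yarp/build
cmake -U CMAKE_INSTALL_LIBDIR .
grep CMAKE_INSTALL_LIBDIR CMakeCache.txt
```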
I've removed the build folder and recompiled (and installed) from scratch. The output of objdump is now: RPATH $ORIGIN/:$ORIGIN/../lib:$ORIGIN/../. I still get the error ImportError: libYARP_OS.so.1: cannot open shared object file: No such file or directory, but I guess this is expected if the RPATH is not fixed for python yet.
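Until the Python RPATH is handled, the usual workaround is to point the loader at the installed lib directory by hand; the prefix below reuses the one from the earlier comments and is otherwise an assumption:

```sh
# Work around the missing RPATH for the Python bindings by exporting the
# installed library directory (prefix taken from the earlier comments).
export LD_LIBRARY_PATH=/usr/local/src/robot/install/yarp/lib:$LD_LIBRARY_PATH
python -c "import yarp"
```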
Ok, thanks, I'm a bit less worried now 😜
I will continue the discussion on the PR.
I just ran into the very same issue as @Tobias-Fischer above when trying to compile the Python bindings separately from within $YARP_ROOT/bindings/build. So far I couldn't fix the build, even after installing YCM and hinting cmake manually to its installation directory. I also tried to copy the relevant *.cmake files from YCM into YARP's cmake dir in the hope it would pick them up, but that didn't do any good either.
I can use the python bindings generated by the global build and located in $YARP_DIR/lib/python2.7/dist-packages, but the difference to using separately compiled bindings appears to be that the generated yarp python module doesn't contain the comments necessary for generating the interface documentation with pydoc. When I run pydoc on that yarp.py, it just tells me that no Python documentation was found. Am I doing something wrong?
Also: is there a more elegant way to let YARP know where YCM is installed, like via some environment variable akin to $YARP_ROOT or $OPENCV_ROOT, etc.? I tried to locate the cmake resource finder for YCM in YARP but couldn't find it, so I couldn't inspect whether it reads out some environment variable.
In my current build I figured out by trial and error to set YCM_DIR in YARP's cmake to $YCM_DIR/share/YCM/cmake, with $YCM_DIR referring to my YCM installation directory as specified in cmake. But that doesn't solve the "Unknown CMake command" error when activating the Python bindings for building. Note that I use a non-standard installation directory in the `make install' step of the YCM build, so I need to tell the YARP and icub-main builds manually where my YCM installation directory is located.
Side note: Prior to doing this I upgraded (git pull'ed) YARP, icub-main, and ycm, and built the whole batch anew (including deleting all old build artifacts first).
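For what it's worth, a configure sketch for the separate bindings build against a non-standard YCM prefix; the YCM_DIR value mirrors the one described above, while the binding option name (CREATE_PYTHON) and the prefix variable are assumptions:

```sh
# Configure the separate bindings build, pointing YCM_DIR at the non-standard
# YCM install; YCM_INSTALL_PREFIX is a placeholder for the actual prefix.
cd $YARP_ROOT/bindings/build
cmake .. -DYCM_DIR=$YCM_INSTALL_PREFIX/share/YCM/cmake -DCREATE_PYTHON=ON
```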
I made a mistake when calling pydoc, so the interface documentation can be generated after all. I ran pydoc -w yarp.py instead of pydoc -w yarp. Instead of a warning, you get the above message that no documentation was found.
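For completeness, the working invocation, assuming the module directory from the earlier comment is on PYTHONPATH:

```sh
# Generate HTML docs for the installed yarp module (module name, not file path);
# the dist-packages location is taken from the comment further up.
export PYTHONPATH=$YARP_DIR/lib/python2.7/dist-packages:$PYTHONPATH
pydoc -w yarp   # writes yarp.html; "pydoc -w yarp.py" reports that no documentation was found
```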