From batuhan.osmanoglu at nasa.gov  Fri Mar 11 17:19:15 2016
From: batuhan.osmanoglu at nasa.gov (OSMANOGLU, BATUHAN (GSFC-6180))
Date: Fri, 11 Mar 2016 22:19:15 +0000
Subject: [Kwiver-users] Starting with MapTK
Message-ID:

Hi all,

I have an airborne dataset collected over forested areas. Looking into Surface/Structure from Motion algorithms I came across MapTK and would like to give it a shot. We are interested in getting tree heights.

I couldn't get access to forge.mil, so I am working with the public version at the moment. Not sure if there are differences.

I couldn't find a tutorial on MapTK (I guess Matt is working on it), but reading some of the emails on this list, I think I will have to:

* Extract frames from the video (ffmpeg etc.)
* Use the config files under /tools/config as templates

This is where it gets fuzzy ;)

* What is the order/sequence of the MapTK commands?
  * maptk_track_features first, etc.
* How do I enter the camera position (or plane position info)?
  * What format does it need, etc.?
* Tools for displaying the point cloud?
  * Or do we get a file that has to be opened with Python and displayed in Mayavi etc.?

I appreciate any suggestions and assistance ;) Oh, and currently I am on MacOS, though I can switch to others if there is a preferred environment.

All the best,
Batu.

--
Batuhan Osmanoglu, Ph.D.
AST, Physical Research Scientist
NASA Goddard Space Flight Center Code 618.0
P: +1 (301) 614 6690


From matt.leotta at kitware.com  Mon Mar 14 10:10:22 2016
From: matt.leotta at kitware.com (Matthew Leotta)
Date: Mon, 14 Mar 2016 10:10:22 -0400
Subject: [Kwiver-users] Starting with MapTK
In-Reply-To:
References:
Message-ID: <6E0A6AFA-1620-4C1C-B51C-7191AA622D02@kitware.com>

Batu,

I'll address your questions inline below.

> On Mar 11, 2016, at 5:19 PM, OSMANOGLU, BATUHAN (GSFC-6180) wrote:
>
> Hi all,
>
> I have an airborne dataset collected over forested areas. Looking into Surface/Structure from Motion algorithms I came across MapTK and would like to give it a shot. We are interested in getting tree heights.
>
> I couldn't get access to forge.mil, so I am working with the public version at the moment. Not sure if there are differences.

None of MAP-Tk is on forge.mil. However, some other parts of KWIVER have a component on forge.mil. We do have an internal development branch of MAP-Tk that is not public. This is where we do much of the development sponsored by the Air Force. Every three months we request approval for that code from the Air Force, and once we get approval we fold that code back into the public GitHub master branch. There is a large chunk of code pending approval right now. I expect approval to come through any day now. Once approved I will push the changes to GitHub, and shortly after that I'll be releasing MAP-Tk v0.8.0. If you can wait a couple of weeks, you'll find MAP-Tk v0.8.0 to be considerably easier to configure.

> I couldn't find a tutorial on MapTK (I guess Matt is working on it), but reading some of the emails on this list, I think I will have to:
> * Extract frames from the video (ffmpeg etc.)
> * Use the config files under /tools/config as templates

Correct. A tutorial will be coming out (pending Air Force approval) in April as a blog post and in the April edition of the Kitware Source (http://www.kitware.com/media/thesource.html). This tutorial will come with some sample data and configuration files. It will focus on MAP-Tk v0.8.0.

For now you do need to extract frames as images using FFmpeg or another tool. That will probably not change until MAP-Tk v0.9.0 later this year.
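For example, a minimal sketch along these lines will dump numbered frames with FFmpeg (the input file name, output pattern, image format, and sampling rate are just placeholders to adjust for your data; any tool that produces a consistently numbered image sequence works equally well):

import os
import subprocess

def extract_frames(video_path, out_dir, fps=2.0):
    """Write frames of video_path into out_dir as frame_000001.png, frame_000002.png, ..."""
    os.makedirs(out_dir, exist_ok=True)
    cmd = [
        "ffmpeg",
        "-i", video_path,                         # input video
        "-vf", "fps=%g" % fps,                    # sample at fps frames per second
        os.path.join(out_dir, "frame_%06d.png"),  # numbered output images
    ]
    subprocess.check_call(cmd)

extract_frames("flight_video.mp4", "frames", fps=2.0)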
In MAP-Tk v0.7.x you should use the config files in tools/config as a template. These files can be a bit unwieldy due to the number of nested algorithms. MAP-Tk v0.8.0 will support modular config files which can be included from other config files. A default set of config files will be installed with the software. This will make the top-level configuration much simpler. You will be able to include the default config files to get the default algorithms and parameters.

> This is where it gets fuzzy ;)
>
> * What is the order/sequence of the MapTK commands?
>   * maptk_track_features first, etc.

The primary two commands are maptk_track_features and maptk_bundle_adjust_tracks, run in that order. The first takes your image sequence and produces a feature track file. The second takes the feature track file and estimates camera parameters and 3D landmarks.

> * How do I enter the camera position (or plane position info)?
>   * What format does it need, etc.?

MAP-Tk can reconstruct the camera positions and landmarks without any prior position info, but the solution is only determined up to an unknown similarity transform (global scale, orientation, and geo-location). If you have metadata about position (e.g. from GPS and IMU) you can use that. The metadata will provide initialization for the solution and also constrain the unknown similarity. Currently the only format we support for the metadata is a format used by the Air Force called POS. You can find an example of POS metadata with the CLIF 2007 data (https://www.sdms.afrl.af.mil/index.php?collection=clif2007). There is one POS file per image, and each contains comma-separated ASCII values for

yaw, pitch, roll, latitude, longitude, altitude, gpsSeconds, gpsWeek, northVel, eastVel, upVel, imuStatus, localAdjustment, dstFlag

We currently only use the first 6 fields of this file in MAP-Tk. The rest could be set to zeros. If you have another format for your metadata I'd be interested to know about it. There is no reason we couldn't support other formats in the future.
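If you end up converting your own avionics log into this layout, a throwaway converter can be as simple as the sketch below. The input CSV columns and the .pos file naming are hypothetical, and it is worth diffing the output against one of the CLIF 2007 POS files to confirm units and exact formatting; only the field order comes from the description above.

import csv

def write_pos_file(path, yaw, pitch, roll, lat, lon, alt):
    # gpsSeconds ... dstFlag are not used by MAP-Tk, so write them as zeros.
    values = [yaw, pitch, roll, lat, lon, alt] + [0.0] * 8
    with open(path, "w") as f:
        f.write(", ".join("%.9f" % v for v in values) + "\n")

# Hypothetical usage: avionics.csv has one row per extracted frame with
# columns frame, yaw, pitch, roll, lat, lon, alt.
with open("avionics.csv") as f:
    for row in csv.DictReader(f):
        write_pos_file("frame_%06d.pos" % int(row["frame"]),
                       float(row["yaw"]), float(row["pitch"]), float(row["roll"]),
                       float(row["lat"]), float(row["lon"]), float(row["alt"]))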
The other format we support for cameras is KRTD, another ASCII format which contains a 3x3 calibration matrix (K), a 3x3 rotation matrix (R), a 1x3 translation vector (t), and a 1xN distortion vector (d). For the distortion vector, N can be between 1 and 8, but you can use a single "0" to model no radial distortion. A KRTD file represents a camera relative to some local origin. The KRTD files are not geo-located but can contain absolute scale and orientation. The POS files, on the other hand, have geo-coordinates but do not contain any information on the camera intrinsic parameters (focal length, distortion, etc.).
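As a rough illustration of what goes into such a file, the sketch below writes the four blocks in K, R, t, d order. The whitespace layout here is an assumption, so compare the result against a file produced by pos2krtd before relying on it; the focal length, image size, and file name in the example are purely hypothetical.

import numpy as np

def write_krtd(path, K, R, t, d=(0.0,)):
    """Write a camera as ASCII blocks in K, R, t, d order (layout assumed)."""
    with open(path, "w") as f:
        for row in np.asarray(K).reshape(3, 3):
            f.write(" ".join("%.12g" % v for v in row) + "\n")
        f.write("\n")
        for row in np.asarray(R).reshape(3, 3):
            f.write(" ".join("%.12g" % v for v in row) + "\n")
        f.write("\n")
        f.write(" ".join("%.12g" % v for v in np.ravel(t)) + "\n")
        f.write("\n")
        f.write(" ".join("%.12g" % v for v in d) + "\n")  # a single 0 means no distortion

# Hypothetical camera: 1000-pixel focal length, principal point at the center
# of a 1920x1080 image, identity rotation, camera at the local origin.
K = [[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]]
write_krtd("frame_000001.krtd", K, np.eye(3), [0.0, 0.0, 0.0])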
There is a tool, pos2krtd, that converts POS files to KRTD given a model of the camera intrinsics as well. The orientation angles (yaw, pitch, roll) can be tricky to get correct in the POS file if you are making the POS file from another data source. Converting POS to KRTD is a good way to check that the POS files are as expected. The KRTD files can be loaded directly into the MAP-Tk GUI application for viewing.

> * Tools for displaying the point cloud?
>   * Or do we get a file that has to be opened with Python and displayed in Mayavi etc.?

The point cloud comes out as a PLY file, which is fairly standard and can be viewed in numerous tools. The MAP-Tk GUI makes it easy to view both the cameras and the 3D point cloud together. In MAP-Tk v0.8.0 you can just "open" the same config files used on the command line to load the cameras and point cloud. MAP-Tk also comes with scripts to aid in importing the results into other third-party tools. MAP-Tk includes Python plugins for Blender (https://www.blender.org/) to load the KRTD files, and Blender natively supports PLY. MAP-Tk v0.8.0 will also provide Ruby plugins for SketchUp (http://www.sketchup.com/) to import both the KRTD camera files and the PLY point cloud.
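If you would rather poke at the point cloud programmatically than open it in a viewer, the vertices of an ASCII PLY file can be pulled out with a few lines of Python. The sketch below assumes an ASCII PLY whose vertex element comes first with x, y, z as its first three properties, and the file name is a placeholder; a binary PLY would need a proper reader such as the plyfile package. Keep in mind that the axes only correspond to real-world directions once the similarity ambiguity is resolved (e.g. with POS metadata).

def read_ply_vertices(path):
    """Return a list of (x, y, z) tuples from an ASCII PLY file."""
    with open(path) as f:
        n_vertices = 0
        for line in f:                          # parse the header
            tokens = line.split()
            if tokens[:2] == ["element", "vertex"]:
                n_vertices = int(tokens[2])
            elif tokens[:1] == ["end_header"]:
                break
        points = []
        for _ in range(n_vertices):             # read the vertex records
            x, y, z = (float(v) for v in next(f).split()[:3])
            points.append((x, y, z))
    return points

points = read_ply_vertices("landmarks.ply")
zs = sorted(p[2] for p in points)
print("z range:", zs[-1] - zs[0])   # crude spread of the points along the z axis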
> I appreciate any suggestions and assistance ;) Oh, and currently I am on MacOS, though I can switch to others if there is a preferred environment.

We support MacOS, Linux, and Windows. I do most of my development on MacOS as well, so that should not be a problem.

MAP-Tk is still very much a work in progress, so please let us know if you run into trouble. If you are willing to get your hands dirty with the code, pull requests are also welcome for bug fixes and new features.

Good luck,
Matt


From batuhan.osmanoglu at nasa.gov  Mon Mar 14 13:56:50 2016
From: batuhan.osmanoglu at nasa.gov (OSMANOGLU, BATUHAN (GSFC-6180))
Date: Mon, 14 Mar 2016 17:56:50 +0000
Subject: [Kwiver-users] Starting with MapTK
In-Reply-To: <6E0A6AFA-1620-4C1C-B51C-7191AA622D02@kitware.com>
References: <6E0A6AFA-1620-4C1C-B51C-7191AA622D02@kitware.com>
Message-ID:

Hello Matt,

Thanks so much for the detailed answers. I think I will wait for the release of MapTK 0.8.0 and the tutorial. Do you mind sending an email to the group when they are released?

The POS data sounds manageable. We had a bunch of different GPS/IMU sensors on board the flight, some of which had vendor-specific formats. But for our in-house codes we use ASCII as well, so matching the POS format is not a problem.

The video I am planning to use was collected through a nadir-looking window of the plane, and I am planning to use the plane's avionics data for the POS data. Thanks for mentioning pos2krtd as a way to check for errors. I am expecting some differences between the camera and GPS antenna position. Also, I don't know much about the camera that collected the imagery. I will look into obtaining that to find out more about camera distortion etc.

The PLY format should be OK. I haven't played with it much, but if Blender can open it, that is perfect.

It seems like at this point I don't need to worry about trying to get access to forge.mil. If we find out that it would be beneficial in the future, I will try to contact them again.

All the best,
Batu.


From matt.leotta at kitware.com  Mon Mar 14 14:30:16 2016
From: matt.leotta at kitware.com (Matthew Leotta)
Date: Mon, 14 Mar 2016 14:30:16 -0400
Subject: [Kwiver-users] Starting with MapTK
In-Reply-To:
References: <6E0A6AFA-1620-4C1C-B51C-7191AA622D02@kitware.com>
Message-ID:

> On Mar 14, 2016, at 1:56 PM, OSMANOGLU, BATUHAN (GSFC-6180) wrote:
>
> Hello Matt,
>
> Thanks so much for the detailed answers. I think I will wait for the release of MapTK 0.8.0 and the tutorial. Do you mind sending an email to the group when they are released?

I certainly will make an announcement when these are available.

> The POS data sounds manageable. We had a bunch of different GPS/IMU sensors on board the flight, some of which had vendor-specific formats. But for our in-house codes we use ASCII as well, so matching the POS format is not a problem.
>
> The video I am planning to use was collected through a nadir-looking window of the plane, and I am planning to use the plane's avionics data for the POS data. Thanks for mentioning pos2krtd as a way to check for errors. I am expecting some differences between the camera and GPS antenna position. Also, I don't know much about the camera that collected the imagery. I will look into obtaining that to find out more about camera distortion etc.

You don't necessarily need to know everything about the camera. In theory you can estimate everything from the data. However, in practice you should use whatever information you have about camera intrinsics, at least for initialization purposes.
This is especially true if you have a very narrow field of view (high zoom) lens and if there is substantial radial distortion. You can tell MAP-Tk to estimate any subset of the camera intrinsic parameters, but convergence may be slow or could even fail if you have too many unknowns. There is also a known degeneracy when estimating radial distortion from a nadir-looking camera [1]. As a first pass I would set skew to zero, use the center of the image as the principal point, and use zero distortion (unless you see noticeable distortion in the video). You can have MAP-Tk estimate the focal length of the lens, but a reasonable guess for initialization can help considerably. Usually you can figure this out from the specs on the camera sensor and lens that were used.

[1] http://ccwu.me/file/radial.pdf
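For instance, if the spec sheet gives the focal length in millimeters and the sensor width, you can compute a rough initial focal length in pixels and build a calibration matrix with zero skew and a centered principal point, as in the sketch below (the numbers are purely illustrative, not from any particular camera):

def initial_k(focal_mm, sensor_width_mm, width_px, height_px):
    """Build a rough initial 3x3 calibration matrix from spec-sheet values."""
    focal_px = focal_mm * width_px / sensor_width_mm   # focal length in pixels
    cx, cy = width_px / 2.0, height_px / 2.0           # principal point at image center
    return [[focal_px, 0.0, cx],
            [0.0, focal_px, cy],   # zero skew, square pixels assumed
            [0.0, 0.0, 1.0]]

# e.g. a hypothetical 25 mm lens on a 12.8 mm wide sensor at 1920x1080 gives
# a focal length of 3750 pixels and a principal point at (960, 540).
K = initial_k(25.0, 12.8, 1920, 1080)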
From Kyle.Brooks at L-3Com.com  Mon Mar 21 15:36:27 2016
From: Kyle.Brooks at L-3Com.com (Kyle.Brooks at L-3Com.com)
Date: Mon, 21 Mar 2016 19:36:27 +0000
Subject: [Kwiver-users] Map-TK Linker Errors on Windows
Message-ID:

Hello,

I installed Map-TK v0.7.2 from the installer and am trying to use "maptk::read_krtd_file" and "maptk::read_pos_file". I have added Map-TK to my project (Portions of file removed):

cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
project(${MY_PROJECT})
find_package(maptk REQUIRED)

add_executable(${EXECUTABLE_NAME} ${SOURCES})
target_include_directories(${EXECUTABLE_NAME} PRIVATE include ${MAPTK_INCLUDE_DIRS})
target_link_libraries(${EXECUTABLE_NAME} ${MAPTK_LIBRARIES})

message(STATUS "MAPTK_LIBRARIES - ${MAPTK_LIBRARIES}")

I am getting two linker errors using Visual Studio 2010 SP1; it compiles just fine.

Error 1: error LNK2019: unresolved external symbol "__declspec(dllimport) class maptk::camera_ __cdecl maptk::read_krtd_file(class boost::filesystem3::path const &)" (__imp_?read_krtd_file@maptk@@YA?AV?$camera_@N@1@AEBVpath@filesystem3@boost@@@Z) referenced in function "class std::vector,class std::allocator > > __cdecl Foo(class std::basic_string,class std::allocator > const &)" (?Foo@@YA?AV?$vector@V?$Transform@M$02$01$0A@@Eigen@@V?$allocator@V?$Transform@M$02$01$0A@@Eigen@@@std@@@std@@AEBV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@2@@Z) foobar.obj

Error 2: error LNK2019: unresolved external symbol "__declspec(dllimport) struct maptk::ins_data __cdecl maptk::read_pos_file(class boost::filesystem3::path const &)" (__imp_?read_pos_file@maptk@@YA?AUins_data@1@AEBVpath@filesystem3@boost@@@Z) referenced in function "class std::vector,class std::allocator > > __cdecl Bar(class std::basic_string,class std::allocator > const &)" (?Bar@@YA?AV?$vector@V?$Transform@M$02$01$0A@@Eigen@@V?$allocator@V?$Transform@M$02$01$0A@@Eigen@@@std@@@std@@AEBV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@2@@Z) foobar.obj

Is this library missing or not matching the header files? Any ideas?

Thanks,
Kyle


From matt.leotta at kitware.com  Mon Mar 21 17:06:46 2016
From: matt.leotta at kitware.com (Matthew Leotta)
Date: Mon, 21 Mar 2016 17:06:46 -0400
Subject: [Kwiver-users] Map-TK Linker Errors on Windows
In-Reply-To:
References:
Message-ID: <10002E69-F8D1-4EFB-BB66-B526A753E959@kitware.com>

Kyle,

I don't know what the issue is offhand. For the v0.7.x installers we focused on making sure the executables and GUI could be installed and run. While the package does include header files and libraries, we didn't test its use as an SDK. I suspect there is something not configured correctly in the installed package.
If you want to develop code against MAP-Tk, you may have better luck if you build from source. I don't have time to look at this issue with the installer anytime soon; I'm preparing for the upcoming v0.8.0 release. If this issue persists with the v0.8.0 release then we will probably address it there. However, we are moving to C++11 with MAP-Tk v0.8.0, which means dropping support for Visual Studio 2010, so that may be an issue for you. We are continuing to support Visual Studio 2013 and above.

-- Matt