SDK Examples

This topic contains 5 replies, has 3 voices, and was last updated by phaydens 1 year, 3 months ago.

    #2071

    phaydens
    Participant

    Hi there,

    I have a problem trying to run the example code called WindowsExample_CartesianControl using Visual Studio 2013. It seems that the file CommandLayerWindows.dll is not loaded, so all the function pointers have null values and I get the message: ERROR DURING INITIALIZATION. However, it works when I use the Kinova Development Center.

    In addition, I'm trying to figure out the include and linker paths used in the project. I only found this one:

    C:\Workspace\api_cpp\Examples.

    The code is presented below:

    #include <Windows.h>
    #include "Lib_Examples\CommunicationLayerWindows.h"
    #include "Lib_Examples\CommandLayer.h"
    #include <conio.h>
    #include "Lib_Examples\KinovaTypes.h"
    #include <iostream>

    using namespace std;

    //A handle to the API.
    HINSTANCE commandLayer_handle;

    //Function pointers to the functions we need
    int(*MyInitAPI)();
    int(*MyCloseAPI)();
    int(*MySendBasicTrajectory)(TrajectoryPoint command);
    int(*MyGetDevices)(KinovaDevice devices[MAX_KINOVA_DEVICE], int &result);
    int(*MySetActiveDevice)(KinovaDevice device);
    int(*MyMoveHome)();
    int(*MyInitFingers)();
    int(*MyGetCartesianCommand)(CartesianPosition &);

    int main(int argc, char* argv[])
    {
        //We load the API.
        commandLayer_handle = LoadLibrary(L"CommandLayerWindows.dll");

        CartesianPosition currentCommand;

        int programResult = 0;

        //We load the functions from the library (under Windows, use GetProcAddress)
        MyInitAPI = (int(*)()) GetProcAddress(commandLayer_handle, "InitAPI");
        MyCloseAPI = (int(*)()) GetProcAddress(commandLayer_handle, "CloseAPI");
        MyMoveHome = (int(*)()) GetProcAddress(commandLayer_handle, "MoveHome");
        MyInitFingers = (int(*)()) GetProcAddress(commandLayer_handle, "InitFingers");
        MyGetDevices = (int(*)(KinovaDevice devices[MAX_KINOVA_DEVICE], int &result)) GetProcAddress(commandLayer_handle, "GetDevices");
        MySetActiveDevice = (int(*)(KinovaDevice device)) GetProcAddress(commandLayer_handle, "SetActiveDevice");
        MySendBasicTrajectory = (int(*)(TrajectoryPoint)) GetProcAddress(commandLayer_handle, "SendBasicTrajectory");
        MyGetCartesianCommand = (int(*)(CartesianPosition &)) GetProcAddress(commandLayer_handle, "GetCartesianCommand");

        //Verify that all functions have been loaded correctly
        if ((MyInitAPI == NULL) || (MyCloseAPI == NULL) || (MySendBasicTrajectory == NULL) ||
            (MyGetDevices == NULL) || (MySetActiveDevice == NULL) || (MyGetCartesianCommand == NULL) ||
            (MyMoveHome == NULL) || (MyInitFingers == NULL))
        {
            cout << "* * * E R R O R D U R I N G I N I T I A L I Z A T I O N * * *" << endl;
            programResult = 0;
        }
        else
        {
            cout << "I N I T I A L I Z A T I O N C O M P L E T E D" << endl << endl;

            int result = (*MyInitAPI)();

            cout << "Initialization's result : " << result << endl;

            KinovaDevice list[MAX_KINOVA_DEVICE];

            int devicesCount = MyGetDevices(list, result);

            for (int i = 0; i < devicesCount; i++)
            {
                cout << "Found a robot on the USB bus (" << list[i].SerialNumber << ")" << endl;

                //Setting the current device as the active device.
                MySetActiveDevice(list[i]);

                cout << "Send the robot to HOME position" << endl;
                MyMoveHome();

                cout << "Initializing the fingers" << endl;
                MyInitFingers();

                TrajectoryPoint pointToSend;
                pointToSend.InitStruct();

                //We specify that this point will be used as a Cartesian velocity vector.
                pointToSend.Position.Type = CARTESIAN_VELOCITY;

                pointToSend.Position.CartesianPosition.X = 0;
                pointToSend.Position.CartesianPosition.Y = -0.15; //Move along the Y axis at -0.15 m per second
                pointToSend.Position.CartesianPosition.Z = 0;
                pointToSend.Position.CartesianPosition.ThetaX = 0;
                pointToSend.Position.CartesianPosition.ThetaY = 0;
                pointToSend.Position.CartesianPosition.ThetaZ = 0;

                pointToSend.Position.Fingers.Finger1 = 0;
                pointToSend.Position.Fingers.Finger2 = 0;
                pointToSend.Position.Fingers.Finger3 = 0;

                for (int i = 0; i < 200; i++)
                {
                    //We send the velocity vector every 5 ms as long as we want the robot to move along that vector.
                    MySendBasicTrajectory(pointToSend);
                    Sleep(5);
                }

                pointToSend.Position.CartesianPosition.Y = 0;
                pointToSend.Position.CartesianPosition.Z = 0.1;

                for (int i = 0; i < 200; i++)
                {
                    //We send the velocity vector every 5 ms as long as we want the robot to move along that vector.
                    MySendBasicTrajectory(pointToSend);
                    Sleep(5);
                }

                cout << "Send the robot to HOME position" << endl;
                MyMoveHome();

                //We specify that this point will be a Cartesian position.
                pointToSend.Position.Type = CARTESIAN_POSITION;

                //We get the current Cartesian command of the robot.
                MyGetCartesianCommand(currentCommand);

                pointToSend.Position.CartesianPosition.X = currentCommand.Coordinates.X;
                pointToSend.Position.CartesianPosition.Y = currentCommand.Coordinates.Y - 0.1f;
                pointToSend.Position.CartesianPosition.Z = currentCommand.Coordinates.Z;
                pointToSend.Position.CartesianPosition.ThetaX = currentCommand.Coordinates.ThetaX;
                pointToSend.Position.CartesianPosition.ThetaY = currentCommand.Coordinates.ThetaY;
                pointToSend.Position.CartesianPosition.ThetaZ = currentCommand.Coordinates.ThetaZ;

                cout << "*********************************" << endl;
                cout << "Sending the first point to the robot." << endl;
                MySendBasicTrajectory(pointToSend);

                pointToSend.Position.CartesianPosition.Z = currentCommand.Coordinates.Z + 0.1f;
                cout << "Sending the second point to the robot." << endl;
                MySendBasicTrajectory(pointToSend);

                cout << "*********************************" << endl << endl << endl;
            }

            cout << endl << "C L O S I N G A P I" << endl;
            result = (*MyCloseAPI)();
            programResult = 1;
        }

        FreeLibrary(commandLayer_handle);

        return programResult;
    }

    Thanks for the help, Eric.

    V/R
    Pedro.

    #2072

    Anonymous

    Hi,

    Could you validate that you have “CommandLayerWindows.dll” in the same folder as your VCProj and in the build folder (Debug or Release)?
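
    As a quick way to confirm where the load fails (this is plain Win32, not anything specific to the example), you can test the handle returned by LoadLibrary before calling GetProcAddress; GetLastError then tells you why the DLL could not be loaded:

    #include <Windows.h>
    #include <iostream>

    int main()
    {
        //Try to load the command layer from the current working directory.
        HINSTANCE handle = LoadLibrary(L"CommandLayerWindows.dll");

        if (handle == NULL)
        {
            //Typical codes: 126 = module (or one of its dependencies) not found, 193 = 32/64-bit mismatch.
            std::cout << "LoadLibrary failed, error code " << GetLastError() << std::endl;
            return 1;
        }

        std::cout << "CommandLayerWindows.dll loaded." << std::endl;
        FreeLibrary(handle);
        return 0;
    }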


    Jean-Philippe Côté, Software Dev at Kinova

    #2074

    phaydens
    Participant

    Hi Jean-Philippe,

    There are some *.dll and *.h files in the Lib_Examples folder. I copied them into the WindowsExample_CartesianControl folder, and it WORKS now.

    I’m trying to move the arm by reading new position data contained in a *.txt file.
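
    Roughly what I have in mind is something like the sketch below, dropped into the same file as the example above. It is only a draft and makes two assumptions that are not in the example: the file contains one pose per line as "X Y Z ThetaX ThetaY ThetaZ", and the function pointers (MySendBasicTrajectory, etc.) have already been loaded as shown earlier.

    #include <fstream>
    #include <sstream>
    #include <string>

    //Reads Cartesian poses from a text file and sends each one as a CARTESIAN_POSITION point.
    void SendPosesFromFile(const std::string &path)
    {
        std::ifstream file(path);
        std::string line;

        while (std::getline(file, line))
        {
            std::istringstream values(line);

            TrajectoryPoint point;
            point.InitStruct();
            point.Position.Type = CARTESIAN_POSITION;

            //One pose per line: X Y Z ThetaX ThetaY ThetaZ (meters and radians).
            if (values >> point.Position.CartesianPosition.X
                       >> point.Position.CartesianPosition.Y
                       >> point.Position.CartesianPosition.Z
                       >> point.Position.CartesianPosition.ThetaX
                       >> point.Position.CartesianPosition.ThetaY
                       >> point.Position.CartesianPosition.ThetaZ)
            {
                MySendBasicTrajectory(point);
            }
        }
    }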

    Thanks for the valuable help. I hope that more people start to use the forum.

    Pedro.

    #2460

    phaydens
    Participant

    Hi Jean-Philippe,

    After some work, I am able to control the Jaco arm using a motion controller. I found some limitations at certain positions where the arm just stops moving. Could you send me, or let me know where I can find, these limitations?

    V/R

    Pedro

    #2461

    Alex
    Participant

    Dear Pedro,

    In Cartesian control with our robots, there are 3 major limitations:

    1- Physical limitations imposed by the carbon links
    2- Base cylinder singularity
    3- Wrist alignment singularity

    If a position is sent near one of these limitations, the arm will stop moving once it gets too close to, or enters, that region.
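
    One practical way to detect in software that the arm has stopped at one of these limits is to compare the pose you command with the pose the arm actually reaches. The sketch below is only an illustration, not part of our examples: it assumes the library also exports a GetCartesianPosition function with the same signature as the GetCartesianCommand pointer used earlier in this thread, and the 2 cm threshold is an arbitrary placeholder.

    #include <cmath>

    //Assumed to be resolved with GetProcAddress, like the other function pointers in the example above.
    int(*MyGetCartesianPosition)(CartesianPosition &);

    //Returns true when the measured pose lags the commanded pose by more than a small
    //threshold, which is what happens when the arm refuses to move further because it
    //is too close to a physical limit or a singularity.
    bool SeemsStuck()
    {
        CartesianPosition commanded, measured;
        MyGetCartesianCommand(commanded);
        MyGetCartesianPosition(measured);

        float dx = commanded.Coordinates.X - measured.Coordinates.X;
        float dy = commanded.Coordinates.Y - measured.Coordinates.Y;
        float dz = commanded.Coordinates.Z - measured.Coordinates.Z;

        return std::sqrt(dx * dx + dy * dy + dz * dz) > 0.02f; //2 cm, tune as needed
    }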

    I hope this helps, and if there is anything else, don’t hesitate to let us know!

    Thank you,

    #2462

    phaydens
    Participant

    Thanks for the prompt answer.

    Can I get the values of these limitations?
