
How To Use IMU To Get The Change of Roll Angle of L515 When It Is Moving #10903

Closed
FeiSong123 opened this issue Sep 15, 2022 · 44 comments

@FeiSong123



Required Info
Camera Model: L515
Firmware Version: 01.05.08.01
Operating System & Version: Ubuntu 18.04
Kernel Version (Linux Only): 5.4.0-125-generic
Platform: PC
SDK Version: 2.50.0
Language: C++
Segment: Robot

Issue Description

Hi! I'm new to IMUs and I have a question about how to use one to rotate my camera's coordinate system, in both 2D and 3D.
My camera is an L515 mounted on a robot. When the robot is initialized and stationary (I assume it is horizontal at that point), I want to record a reference orientation from the direction of gravitational acceleration. After the robot moves, the camera will have a roll angle. I want to obtain the roll angle of the camera relative to the initial position from the IMU, and use it to rotate the camera's coordinate system back to horizontal. Can the L515 do this?

@MartyG-RealSense
Collaborator

Hi @FeiSong123. It may not be practical to rotate the IMU coordinate system, as it is aligned with the depth coordinate system, as described at #6456 (comment) and #8249 (comment).

The coordinate system of the D435i camera model applies to the L515 model too.

If you want to obtain the roll angle of the camera relative to the initial position then perhaps you could store the original roll angle when the camera is at horizontal, then subtract the new angle from the original one to find the angle that the roll has been deflected from the original orientation (new roll - original roll = number of degrees moved through). You should then know how many degrees have to be moved through to return from the new position to the original one.

@FeiSong123
Author

No, I don't want to rotate the IMU coordinate system; I want to rotate the color image and the XYZ values.

@MartyG-RealSense
Collaborator

The yaw value should correspond to the Y angle of the gyro, as shown in the C++ rs-motion SDK example program.

https://github.com/IntelRealSense/librealsense/blob/master/examples/motion/rs-motion.cpp#L139

I am not aware though of a method by which the IMU can be used to rotate RealSense color or depth images, unfortunately.

Somebody else asked a similar question on the OpenCV forum at the link below.

https://forum.opencv.org/t/using-imu-roll-pitch-and-yaw-data-with-opencv/7486

@FeiSong123
Author

Sorry for my unclear statement. What I want is to obtain the change of roll angle through IMU, and then rotate the coordinate system of the data according to the roll angle.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 15, 2022

Whilst you should certainly be able to calculate the change in roll angle, I do not know of a way to use that to rotate the coordinate system. Such a rotation would not normally be required with IMU data, as you are not needing to rotate the orientation of a visible image like with depth or color to orient a picture the 'right way up'.

@FeiSong123
Author

I now know how to rotate the coordinate system of the data; what I don't know is how to obtain the change in roll angle, which I want to get by enabling the GYRO and ACCEL streams.

@MartyG-RealSense
Collaborator

If you want to know the difference between the original and final roll angle then that could be done by storing the original angle and new angle in separate variables and subtracting the original angle from the new one.

If you needed to know the change of angle over time then you could potentially convert the gyro's radians per second (rad/sec) to degrees per second (deg/sec) to get the angular velocity, as described at #3107

@FeiSong123
Author

Do the original and final roll angles need to be obtained as in rs-motion.cpp? And does that require building librealsense from source? I installed it via apt-get as described at https://github.com/IntelRealSense/librealsense/blob/master/doc/distribution_linux.md but cannot find #include "example.hpp" and #include "d435.h", which are included in rs-motion.cpp.

@FeiSong123 FeiSong123 changed the title How To Use IMU To Rotate Camera Coordinate System How To Use IMU To Get The Change of Roll Angle of L515 When It Is Moving Sep 16, 2022
@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 16, 2022

You could get the original and final roll angle by using rs-motion's gyro_angle.z = gyro_data.z; at the start position of the camera and store it in the variable for the 'original' value, and then repeat gyro_angle.z = gyro_data.z; again when the camera has reached its final position and store the result in a different variable for the 'final' value.

The source code version of librealsense is required if you want to build the SDK example programs such as rs-motion yourself using CMake.

The SDK examples such as rs-motion are built in a different way to how you would build a program that you have created yourself, as described at #10715 (comment)

@FeiSong123
Author

FeiSong123 commented Sep 16, 2022

Thank you for your guidance! I have another problem: I want to enable four streams and process them in two threads, with color and depth in the main thread and gyro and accel in a sub-thread. It seems that the frame rate of the IMU is limited by the other two streams. What should I do?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 16, 2022

In C++ the recommended approach to implementing this would be to use callbacks, like in the SDK example program rs-callback:

https://github.com/IntelRealSense/librealsense/tree/master/examples/callback


@FeiSong123
Author

Thanks for your help; I can use them via rs-callback. I have a few questions about it.

Here is my general code:

    cfg.disable_all_streams();
    cfg.enable_stream(RS2_STREAM_DEPTH, IMAGE_WIDTH, IMAGE_HEIGHT, RS2_FORMAT_Z16, 30);
    cfg.enable_stream(RS2_STREAM_COLOR, IMAGE_WIDTH, IMAGE_HEIGHT, RS2_FORMAT_BGR8, 30);
    // cfg.enable_stream(RS2_STREAM_POSE, RS2_FORMAT_6DOF, 30);
    cfg.enable_stream(RS2_STREAM_GYRO, RS2_FORMAT_MOTION_XYZ32F, 400);
    cfg.enable_stream(RS2_STREAM_ACCEL, RS2_FORMAT_MOTION_XYZ32F, 400);

    // Define frame callback
    // The callback is executed on a sensor thread and can be called simultaneously from multiple sensors
    // Therefore any modification to common memory should be done under lock
    auto callback = [&](const rs2::frame& frame)
    {
        std::lock_guard<std::mutex> lock(uptateMutex);
        if (rs2::frameset fs = frame.as<rs2::frameset>())
        {
            for (const rs2::frame& f : fs)
                counters[f.get_profile().unique_id()]++;

            // Handle image
            TickMeter tk;
            tk.start();

            // Processing data
            Update(fs);
            showImage();

            tk.stop();
            cout << "total time:" << tk.getTimeMilli() << endl;
            tk.reset();
        }
        else if (rs2::motion_frame motionFrame = frame.as<rs2::motion_frame>())
        {
            if (motionFrame.get_profile().stream_name() == "Gyro")
            {
                // Get the timestamp of the current frame
                double ts = motionFrame.get_timestamp();
                // Handle gyro data
                rs2_vector gyroData = motionFrame.get_motion_data();
                processGyro(gyroData, ts);
                // cout << "gyroData:" << gyroData.x << "\t" << gyroData.y << "\t" << gyroData.z << endl;

                counters[frame.get_profile().unique_id()]++;
            }
            else if (motionFrame.get_profile().stream_name() == "Accel")
            {
                // Handle accel data
                rs2_vector accelData = motionFrame.get_motion_data();
                processAccel(accelData);
                // cout << "accelData:" << accelData.x << "\t" << accelData.y << "\t" << accelData.z << endl;

                counters[frame.get_profile().unique_id()]++;
            }
        }
    };

    while (true)
    {
        // std::this_thread::sleep_for(std::chrono::microseconds(2500));

        // std::lock_guard<std::mutex> lock(uptateMutex);

        // std::cout << std::endl;
        // for (auto p : counters)
        // {
        //     std::cout << stream_names[p.first] << "[" << p.first << "]: " << p.second << " [frames] || ";
        // }
    }

I have a few questions about it. The callback has opened at least two threads, and with the while loop below there are at least three threads, right? Are the depth and color streams synchronized and stable at 30 FPS? Do I still need operations like disabling auto-exposure priority, and how do I do that? I don't know if the IMU's two streams are too fast; the thread that updates the angle from the depth and color streams is very slow.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 17, 2022

The default pair of values for the IMU streams (for the 400 Series cameras) are 200 for the gyro and 63 for accel. A faster frequency pairing that is also supported is 400 for the gyro and 250 for accel.

Each IMU data packet is timestamped using the depth sensor hardware clock to allow temporal synchronization between gyro, accel and depth frames.

In your full script, on the pipeline start line, have you placed the word callback in the brackets so that the script knows that callbacks should be used, like in the start line of rs-callback?

https://github.com/IntelRealSense/librealsense/blob/master/examples/callback/rs-callback.cpp#L46

I would certainly recommend trying to disable auto-exposure priority. There is C++ code for doing so at #5290 - remember to set the sensor index number to '1' so that the RGB sensor is accessed instead of the depth sensor (which has index '0'), as auto-exposure priority is an RGB option, not a depth option.

I do not know how many threads there should be.

@FeiSong123
Author

Yes, I have started the callback like this:

rs2::pipeline_profile profiles = pipe.start(cfg, callback);

@FeiSong123
Author

When I configure it like this:

cfg.enable_stream(RS2_STREAM_GYRO, RS2_FORMAT_MOTION_XYZ32F, 400);
cfg.enable_stream(RS2_STREAM_ACCEL, RS2_FORMAT_MOTION_XYZ32F, 250);

it reports errors:

terminate called after throwing an instance of 'rs2::error'
what(): Couldn't resolve requests

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 18, 2022

The L515 data sheet document lists that the L515's IMU has different supported frequencies (100, 200, 400) than the IMU in the 400 Series cameras.

[screenshots: the L515 data sheet's supported-frequency tables for ACCEL and GYRO]

This is confirmed when checking the L515's selectable IMU frequencies in the RealSense Viewer.

[screenshot: the L515's IMU frequency options in the RealSense Viewer's motion module controls]

Please try changing the Accel speed to 200 instead of 250.

@FeiSong123
Author

It can run as long as the frame rates of gyro and accel are each one of 100/200/400; they don't need to be the same.

I have another question here.

I want to obtain gyro and accel separately, like the code below:

auto callback = [&](const rs2::frame& frame)
{
    std::lock_guard<std::mutex> lock(uptateMutex);
    if (rs2::frameset fs = frame.as<rs2::frameset>())    // obtain depth and color
    {
        for (const rs2::frame& f : fs)
            counters[f.get_profile().unique_id()]++;

        // Handle image
    }
    else if (rs2::motion_frame motionFrame = frame.as<rs2::gyro_frame>())    // obtain gyro data
    {
        // Handle gyro data
        counters[frame.get_profile().unique_id()]++;
    }
    else if (rs2::motion_frame motionFrame = frame.as<rs2::accel_frame>())    // obtain accel data
    {
        // Handle accel data
        counters[frame.get_profile().unique_id()]++;
    }
};

Is there any way to do this ?

@MartyG-RealSense
Collaborator

At #6426 a RealSense user wrote an adaptation of rs-callback that streamed only gyro and accel if a bool called useConfig was set to true, and otherwise used the default stream profile of the camera (depth and RGB) if useConfig was set to false.

Is separating out gyro and accel on its own like this what you had in mind for your own project, please?

@FeiSong123
Author

I think gyro and accel data should be processed in the same thread. When I run on a computer with a CPU frequency of 3.9MHZ, the roll angle (calculated from gyro and accel data) updates smoothly. When I run on a computer with a CPU frequency of 3.7MHZ (and with large fluctuations), the update stutters a bit. So I want to separate them and try to solve the stuttering problem.

I tried the following callback code on the computer with the lower frequency:

auto callback = [&](const rs2::frame& frame)
{
    std::lock_guard<std::mutex> lock(uptateMutex);
    if (rs2::frameset fs = frame.as<rs2::frameset>())    // obtain depth and color
    {
        for (const rs2::frame& f : fs)
            counters[f.get_profile().unique_id()]++;

        // Handle image
    }
    else if (frame.get_profile().stream_name() == "Gyro")    // obtain gyro data
    {
        // Handle gyro data
        counters[frame.get_profile().unique_id()]++;
    }
    else if (frame.get_profile().stream_name() == "Accel")    // obtain accel data
    {
        // Handle accel data
        counters[frame.get_profile().unique_id()]++;
    }
};

But it doesn't seem to work.

@MartyG-RealSense
Collaborator

There is a C++ callback case at #6424 where the RealSense user was not achieving the desired IMU frame rate. They also wrote a script based on rs-callback; the solution in their particular case was to disable global time.

Another RealSense callback user found that the color FPS seemed to be dragging down their IMU performance, so it may be worth trying to set color to 60 instead of 30.

@FeiSong123
Author

Apart from changing the color stream from 30 FPS to 60 FPS, nothing else changed, but the depth stream seems to get stuck on a certain frame; this was the same over several runs.

@FeiSong123
Author

Disabling global time didn't work. I also tried on another computer whose CPU frequency is 4.0GHZ when the program runs. The CPU frequency unit I mentioned above was wrong; it should be GHZ. So this seems to have nothing to do with CPU frequency. One thing we noticed is that the computer that runs normally has an Intel Core i7-8809G CPU @ 3.10GHz × 8 (memory 31.3 GiB), while the ones that do not run normally have an Intel Core i9-10880H CPU @ 2.30GHz × 16 (memory 15.5 GiB) and an i5-10400F (memory 15.5 GiB). We don't think the cause is RAM, because memory utilization is only 0.7% when the program is running.

@MartyG-RealSense
Collaborator

If depth frames are repeating for several frames then that suggests that there may be frame drops on the depth stream, as the SDK will return to the last known good frame on a stream if frame drops occur.

Introducing latency into the frame queue by increasing the frame queue capacity above its default size of '1' can help to reduce frame drops.

https://github.com/IntelRealSense/librealsense/wiki/Frame-Buffering-Management-in-RealSense-SDK-2.0#latency-vs-performance

A frame queue capacity of '2' is suggested by the documentation link above when using two streams (depth and color).

@FeiSong123
Author

Sorry to bother you, but I still want to know how many threads are created by the callback (like the following), and whether gyro and accel are processed in the same thread.

auto callback = [&](const rs2::frame& frame)
{
    std::lock_guard<std::mutex> lock(uptateMutex);
    if (rs2::frameset fs = frame.as<rs2::frameset>())
    {
        for (const rs2::frame& f : fs)
            counters[f.get_profile().unique_id()]++;

        // Handle image
    }
    else if (rs2::motion_frame motionFrame = frame.as<rs2::motion_frame>())
    {
        if (motionFrame.get_profile().stream_name() == "Gyro")
        {
            // Get the timestamp of the current frame
            double ts = motionFrame.get_timestamp();
            // Handle gyro data
            rs2_vector gyroData = motionFrame.get_motion_data();
            processGyro(gyroData, ts);
            counters[frame.get_profile().unique_id()]++;
        }
        else if (motionFrame.get_profile().stream_name() == "Accel")
        {
            // Handle accel data
            rs2_vector accelData = motionFrame.get_motion_data();
            processAccel(accelData);
            counters[frame.get_profile().unique_id()]++;
        }
    }
};

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 23, 2022

At #7098 a RealSense team member provides information about the number of threads created. Within that discussion at #7098 (comment) they advise that each stream endpoint has 2 threads and gyro and accel each have a thread.

They add that there are 3 endpoints for 400 Series cameras and 4 endpoints for L515.

@FeiSong123
Author

I've been looking into this for a long time. The reason my program stutters is probably that only a single CPU core is being used. Is there any official way in librealsense to set the number of CPUs used?

@MartyG-RealSense
Collaborator

The ability to define the maximum number of CPU cores that can be used (each core has 2 threads) is only available when installing librealsense from source code, by using a '-j' option such as -j8.

https://github.com/IntelRealSense/librealsense/blob/master/doc/installation.md#building-librealsense2-sdk

However, you can enable programs to make use of multiple CPU cores instead of a single core (but not set the number of cores) by building librealsense from source code and including the CMake build flag -DBUILD_WITH_OPENMP=true.

This will enable librealsense to make use of multiple cores when performing depth-color alignment or YUY to RGB color conversion, at the cost of a slightly higher CPU percentage utilization.
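Assuming a source build as described in the installation guide linked above, the two options mentioned in this comment might be combined like this (the directory layout is illustrative):

```shell
# From a checkout of the librealsense repository (illustrative paths)
mkdir build && cd build

# -DBUILD_WITH_OPENMP=true lets librealsense spread depth-color alignment
# and YUY-to-RGB conversion work across multiple CPU cores at runtime
cmake .. -DCMAKE_BUILD_TYPE=Release -DBUILD_WITH_OPENMP=true

# -j8 only controls how many cores are used to COMPILE the SDK,
# not how many cores the SDK uses when running
make -j8
sudo make install
```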

@FeiSong123
Author

I'd like to use OpenCV to rotate an image by a continuously changing angle. From the imshow output I can see that when the rotation angle is a fixed value, both srcImage and rotateImage are very smooth. The problem is that when the rotation angle starts to change, srcImage is still very smooth, but rotateImage is very laggy. Notably, on one of my computers both images are smooth whether the angle is fixed or changing; on the other computers the problem is as described above. I don't know whether it is an environment-configuration problem, and if so, which part of the environment. Here is my code.

Rect bbox;
imageRotationMatix = getRotationMatrix2D(imgCenter, changeAngle, 1.0); // positive angle = counterclockwise
bbox = RotatedRect(imgCenter, srcImage.size(), changeAngle).boundingRect();
imageRotationMatix.at<double>(0, 2) += bbox.width / 2.0 - imgCenter.x;
imageRotationMatix.at<double>(1, 2) += bbox.height / 2.0 - imgCenter.y;
warpAffine(srcImage, rotateImage, imageRotationMatix, bbox.size());
warpAffine(depthImage, rotateDepthImage, imageRotationMatix, bbox.size());
invertAffineTransform(imageRotationMatix, tempRotationMatix);
cv2eigen(tempRotationMatix, imageInvRotateMatrix);
dstImage = rotateImage.clone();
cv::cvtColor(rotateImage, HSVImage, cv::COLOR_BGR2HSV);
waitKey(1);

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 25, 2022

If the problem occurs when changing an angle in real-time, I wonder whether a rotational angle phenomenon called gimbal lock could be occurring.

https://en.wikipedia.org/wiki/Gimbal_lock

A symptom of a gimbal lock occurring may be if there is a sudden large jump, or 'snap', in the rotation angle from one position to another.

Alternatively, if nothing is happening but the values are still updating then it may be because the calculations in the script that change the value in real-time are causing invalid NaN values (Not a Number) to be generated.

@FeiSong123
Author

FeiSong123 commented Sep 26, 2022

Very strange! I saved srcImage and rotateImage through imwrite and found that they were synchronized, but imshow did not display rotateImage in time. Then I changed the waitKey value from 1 to 3, which is much smoother (the code above takes about 1.5-4 ms).

There is no NaN value in the angle being updated in real time, and there is no significant jump. In fact, I tried adding 1 degree to the rotation angle per cycle (33 ms), but it was still too laggy on the lower-performance computers, while adding 10 degrees per cycle on the good computer was still very smooth.

I don't know whether this imshow problem is related to the computer's graphics-processing capability.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 26, 2022

You could check whether adding the SDK's GLSL acceleration function to your script helps. This is done by changing rs2:: instructions in a script to rs2::gl:: so that processing is offloaded from the CPU to the GPU. For example, to add GLSL support to the rs2::pointcloud instruction:

rs2::gl::pointcloud

A C++ demonstration of the GLSL function is provided by the example program rs-gl

https://github.com/IntelRealSense/librealsense/tree/master/examples/gl

https://github.com/IntelRealSense/librealsense/blob/master/examples/gl/rs-gl.cpp

Unlike CUDA graphics acceleration, which requires an Nvidia GPU, GLSL is 'vendor neutral'. This means that it can work with any GPU brand, though there may not be a noticeable improvement on low-power computing devices.

@FeiSong123
Author

I'm glad you came to help me, but I think this is not a RealSense problem; it's an OpenCV problem. I will learn more about rs2::gl in the future if I need it.

I have another problem. I use the angle calculated from gyro and accel. When the robot moves slowly it seems quite right, but when the robot moves fast it seems wrong.

@MartyG-RealSense
Collaborator

The IMU can be sensitive by default. The notes in the code of the rs-motion C++ example explain how increasing the alpha value will weight the data towards the gyro instead of the accelerometer and so make the IMU less sensitive to disturbances.

https://github.com/IntelRealSense/librealsense/blob/master/examples/motion/rs-motion.cpp#L117-L118

@FeiSong123
Author

What range can I adjust the alpha value over, 0-1?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Sep 27, 2022

The alpha is 0.98f by default in the script. As I have seen calculations that use '1 - alpha', that suggests alpha should not be greater than 1; otherwise 1 - alpha would be a negative value.

Does sensitivity reduce if you use 0.99f?

@MartyG-RealSense
Collaborator

Hi @FeiSong123 Do you require further assistance with this case, please? Thanks!

@FeiSong123
Author

I'm sorry I haven't replied for several days; because of the new epidemic, we have been isolated.

It seems that the effect is better when alpha is changed to 0.99.

Another problem is the zero drift of the gyroscope. Is there any way to mitigate it?

@MartyG-RealSense
Collaborator

Is the 'zero drift' related to the problem that you described earlier in this discussion: "I use the angle calculated by gyro and accel. When the robot moves slowly, it feels quite right, but when the robot moves fast, it feels wrong"?

@FeiSong123
Author

I learned this term from other gyro programs. The phenomenon is that when the sensor doesn't move, the angle still changes as time goes on.

@MartyG-RealSense
Collaborator

If it is the Accel that is changing when the camera is not moving, this is a known issue with the RealSense IMU component used in the camera, as the IMU does not have internal hardware calibration.

If an IMU calibration is performed with the Python IMU calibration tool at the link below and saved to the camera hardware then a new option in the RealSense Viewer's motion module controls is unlocked that allows the IMU values to be automatically 'corrected' to more accurate values.

https://github.com/IntelRealSense/librealsense/tree/master/tools/rs-imu-calibration

The correction feature can be enabled in C++ scripting with the instruction RS2_OPTION_ENABLE_MOTION_CORRECTION

sensor.set_option(RS2_OPTION_ENABLE_MOTION_CORRECTION, 1);

@FeiSong123
Author

How can I get the intrinsics of the gyro and accel, like #10295 (comment), and the extrinsics from accel or gyro to color, like #10180 (comment), using C++?

@MartyG-RealSense
Collaborator

The L515 uses the D435i's coordinate system. My understanding is that to compare the extrinsics of a pair of sensors, you should define separate profiles for the two stream types that you will be comparing and then use get_extrinsics_to, as described in the C++ code at #6236

For example:

rs2::stream_profile gyroStream = pipeline.get_active_profile().get_stream(RS2_STREAM_GYRO);
rs2::stream_profile colorStream = pipeline.get_active_profile().get_stream(RS2_STREAM_COLOR);

rs2_extrinsics rsExtrinsics = gyroStream.get_extrinsics_to(colorStream);

You can retrieve the accel and gyro intrinsics with get_motion_intrinsics(), as described at #2905

@FeiSong123
Author

Thank you very much for all your help. I have no more problems on this topic now, so you can close it at any time. If I run into problems in this area in the future, I will reopen the topic; I hope that won't disturb you. Thank you again.

@MartyG-RealSense
Collaborator

You are very welcome. It is no trouble at all and I am pleased that I was able to help. As you do not require further assistance at this time, I will close the case as you suggested. Please do re-open the case or create a new one if you have further questions at a later time. Thanks again!
