
export_to_ply sometimes saves wrong point cloud #6560

Closed
EmmanuelCos opened this issue Jun 10, 2020 · 28 comments

Comments

@EmmanuelCos

Required Info
Camera Model: D435
Operating System & Version: Win10
Platform: PC
Language: C++

Hello,
I am using rs-multicam with two cameras. At the end of the process, when the user leaves the app, I save the point clouds with export_to_ply.
Most of the time it works well, but sometimes the point cloud has the wrong scale and the color is not aligned with the depth (without changing anything in the code...).

What I do differently from the standard program is that I retry poll_for_frames several times, and then, once I have my frameset, I change several parameters (the temporal filter, etc.).

Here is what I am doing to save the point clouds:

//for (auto &&pipe : pipelines){
for (auto &&pipe : pipelines) {

	///auto frames = pipe.wait_for_frames();
	rs2::frameset frames;
	while (!pipe.poll_for_frames(&frames)) {
		
	}
	//setParameters(frames, pipe.get_active_profile());
	const char* serialNumber = rs2::sensor_from_frame(frames)->get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
	std::string currentName;

	char* dirToSave = argv[1];
	std::stringstream ss;
	ss << dirToSave;
	std::string dirStr = ss.str();
	char* imageName = argv[2];
	std::stringstream ss2;
	ss2 << imageName;
	std::string imNameStr = ss2.str();

	if (strcmp(serialNumber, serialNumbers[0].c_str()) == 0) {
		currentName = dirStr + "/" + imageName + "RR.ply";
	}
	else if (strcmp(serialNumber, serialNumbers[1].c_str()) == 0) {
		currentName = dirStr + "/" + imageName + "RL.ply";
	}
	else {
		currentName = dirStr + "/" + imageName + "ERROR.ply";
	}

	//rs2::align align(RS2_STREAM_DEPTH);//RS2_STREAM_COLOR
	//auto aligned_frames = align.process(frames);
	auto color = frames.get_color_frame();
	rs2::pointcloud pc;
	// Tell pointcloud object to map to this color frame
	pc.map_to(color);
	auto depth = frames.get_depth_frame();
	// Generate the pointcloud and texture mappings
	rs2::points points;
	points = pc.calculate(depth);
	points.export_to_ply(currentName, color);
}
@MartyG-RealSense
Collaborator

Hi @EmmanuelCos I could not identify where in your program you apply the temporal filter. If you are doing it after alignment, though, it is recommended to apply post-processing filters before alignment, to help avoid distortions such as aliasing (jagged lines).
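The recommended ordering can be sketched roughly as follows (an untested sketch assuming the standard librealsense2 post-processing API; `pipe` is an already-started `rs2::pipeline` and the filter parameters are left at their defaults):

```cpp
// Sketch: run post-processing filters on the frames first, then align,
// because applying filters after alignment can introduce aliasing (jagged lines).
rs2::spatial_filter spatial;
rs2::temporal_filter temporal;
rs2::align align_to_color(RS2_STREAM_COLOR);

rs2::frameset fs = pipe.wait_for_frames();
fs = spatial.process(fs);        // filters first...
fs = temporal.process(fs);
fs = align_to_color.process(fs); // ...then alignment
```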

@EmmanuelCos
Author

Hello MartyG.
Yes, I am doing it before alignment.
Here is my entire code:

void setParameters(rs2::frameset frames, rs2::pipeline_profile &profile) {


rs2::decimation_filter dec_filter;
rs2::disparity_transform depth_disparity;
rs2::spatial_filter spat_filter;
rs2::temporal_filter temporal_filter(0.4, 20.0, 8);
rs2::hole_filling_filter hole_fill_filter;
frames = temporal_filter.process(frames);

auto depth = frames.get_depth_frame();

auto color = frames.get_color_frame();
int widthC = color.get_width();
std::cout << "widthC " << widthC << std::endl;

rs2::device selected_device = profile.get_device();


// JSON, short range preset 
auto advanced_mode_dev = selected_device.as<rs400::advanced_mode>();
std::string json_content = "{\"param-censususize\": 9,\n\"param-censusvsize\" : 3,\n\"param-disableraucolor\" : 0,\n\"param-disablesadcolor\" : 0,\n\"param-disablesadnormalize\" : 0,\n\"param-disablesloleftcolor\" : 0,\n\"param-disableslorightcolor\" : 0,\n\"param-lambdaad\" : 2000.0,\n\"param-lambdacensus\" : 39,\n\"param-leftrightthreshold\" : 52,\n\"param-maxscorethreshb\" : 2000,\n\"param-medianthreshold\" : 100,\n\"param-minscorethresha\" : 13,\n\"param-neighborthresh\" : 1,\n\"param-raumine\" : 6,\n\"param-rauminn\" : 1,\n\"param-rauminnssum\" : 1,\n\"param-raumins\" : 1,\n\"param-rauminw\" : 4,\n\"param-rauminwesum\" : 8,\n\"param-regioncolorthresholdb\" : 0.02404790738643174,\n\"param-regioncolorthresholdg\" : 0.05924149629687343,\n\"param-regioncolorthresholdr\" : 0.1693732549608611,\n\"param-regionshrinku\" : 4,\n\"param-regionshrinkv\" : 1,\n\"param-regionspatialthresholdu\" : 7,\n\"param-regionspatialthresholdv\" : 3,\n\"param-robbinsmonrodecrement\" : 11,\n\"param-robbinsmonroincrement\" : 1,\n\"param-rsmdiffthreshold\" : 4.707440954582961,\n\"param-rsmrauslodiffthreshold\" : 0.007420896027247603,\n\"param-rsmremovethreshold\" : 0.31771802714889097,\n\"param-scanlineedgetaub\" : 849,\n\"param-scanlineedgetaug\" : 515,\n\"param-scanlineedgetaur\" : 736,\n\"param-scanlinep1\" : 5,\n\"param-scanlinep1onediscon\" : 341,\n\"param-scanlinep1twodiscon\" : 248,\n\"param-scanlinep2\" : 56,\n\"param-scanlinep2onediscon\" : 28,\n\"param-scanlinep2twodiscon\" : 415,\n\"param-secondpeakdelta\" : 26,\n\"param-texturecountthresh\" : 0,\n\"param-texturedifferencethresh\" : 500,\n\"param-usersm\" : 1\n}";
advanced_mode_dev.load_json(json_content);
// JSON, short range preset

auto depth_sensor = selected_device.first<rs2::depth_sensor>();


depth_sensor.set_option(rs2_option::RS2_OPTION_VISUAL_PRESET, rs2_rs400_visual_preset::RS2_RS400_VISUAL_PRESET_DEFAULT); //RS2_RS400_VISUAL_PRESET_DEFAULT


depth_sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 0); // Disable auto exposure
//CUSTOM MANU
depth_sensor.set_option(RS2_OPTION_GAIN, 16); // default 16
depth_sensor.set_option(RS2_OPTION_EXPOSURE, 5000); // default 8500
depth_sensor.set_option(RS2_OPTION_LASER_POWER, 30); // default 150
depth_sensor.set_option(RS2_OPTION_DEPTH_UNITS, 0.0003);//default 0.001
//depth_sensor.set_option(RS2_OPTION_DEPTH_UNITS, 0.001);//default 0.001


auto color_sensor = selected_device.query_sensors()[1];

color_sensor.set_option(RS2_OPTION_ENABLE_AUTO_EXPOSURE, 0); // Disable auto exposure
color_sensor.set_option(RS2_OPTION_EXPOSURE, 5000.0);//2500. 10000 = max
color_sensor.set_option(RS2_OPTION_GAMMA, 300.0);//300 //500
color_sensor.set_option(RS2_OPTION_GAIN, 60.0);//default is 60//128.0 improved to see brighter

//rs2::process_interface opti;

// Configure filter parameters
dec_filter.set_option(RS2_OPTION_FILTER_MAGNITUDE, 1);//By default 2

spat_filter.set_option(RS2_OPTION_FILTER_MAGNITUDE, 2);//By default 2
spat_filter.set_option(RS2_OPTION_FILTER_SMOOTH_ALPHA, 0.50f);
spat_filter.set_option(RS2_OPTION_FILTER_SMOOTH_DELTA, 20);
spat_filter.set_option(RS2_OPTION_HOLES_FILL, 2);


int N = 30;

rs400::advanced_mode advanced_device(selected_device);
auto depth_table = advanced_device.get_depth_table();
//depth_table.depthClampMax = 1100; // 1.1 m if depth unit is 0.001; set between 600 and 1500
depth_table.depthClampMax = 4000; // max depth clamp; with depth unit 0.0003 this is 1.2 m
depth_table.disparityShift = 50;//50

advanced_device.set_depth_table(depth_table);
//CUSTOM MANU - changes the maximum depth - END

STDepthControlGroup depth_control = advanced_device.get_depth_control();

depth_control.deepSeaNeighborThreshold = 120;//150 is better at the edges
depth_control.deepSeaSecondPeakThreshold = 700;
depth_control.scoreThreshB = 1000;



advanced_device.set_depth_control(depth_control);
 }

bool tryPolForFrame(rs2::pipeline& pipe, rs2::device& dev, rs2::context &ctx, std::map<std::string, rs2::colorizer> &colorizers, rs2::frameset& frames, rs2::pipeline_profile &profile) {

pipe = rs2::pipeline(ctx);
rs2::config cfg;
cfg.enable_device(dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER));
//cfg.enable_device( serialNumbers[i] );
std::cout << "cam to enable:  " << dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER) << std::endl;
profile = pipe.start(cfg);
// Map from each device's serial number to a different colorizer
colorizers[dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER)] = rs2::colorizer();
//colorizers[serialNumbers[i]] = rs2::colorizer();


int numRetry = 0;
int numRetryMax = 100000000;
while (!pipe.poll_for_frames(&frames) && numRetry < numRetryMax) {
	numRetry++;
	//std::cout << "retry " << numRetry;
}
std::cout << "NUM retry : " << numRetry << std::endl;


if (numRetry == numRetryMax)
	return false;
else
	return true;
}






int main(int argc, char * argv[]) try
{

int numCam = 2;

std::cout << "Num Cam " << numCam << std::endl;
std::vector<std::string> serialNumbers(numCam);
for (int iCam = 0; iCam < numCam; iCam++) {
	char* SNc = argv[3 + iCam];
	std::stringstream ssI;
	ssI << SNc;
	std::string SNstr = ssI.str();
	serialNumbers[iCam] = SNstr;
}



rs2::context   ctx;        // Create librealsense context for managing devices

std::map<std::string, rs2::colorizer> colorizers; // Declare map from device serial number to colorizer (utility class to convert depth data RGB colorspace)

std::vector<rs2::pipeline> pipelines;

// Start a streaming pipe per each connected device
for (auto&& dev : ctx.query_devices())
//for (int i=0; i< serialNumbers.size(); i++)
{

	rs2::frameset frames;
	rs2::pipeline pipe;
	rs2::pipeline_profile profile;
	int numRet = 0;
	int numMaxRet = 5;

	while (!tryPolForFrame(pipe, dev, ctx, colorizers, frames, profile) && numRet < numMaxRet)       {
		numRet++;
	};

	setParameters(frames, profile);

	pipelines.emplace_back(pipe);

}





// We'll keep track of the last frame of each stream available to make the presentation persistent
std::map<int, rs2::frame> render_frames;

// Create a simple OpenGL window for rendering:
window app(1280, 960, "CPP Multi-Camera Example");

while (app)
{
    // Collect the new frames from all the connected devices
    std::vector<rs2::frame> new_frames;
    for (auto &&pipe : pipelines)
    {
        rs2::frameset fs;

        if (pipe.poll_for_frames(&fs))
        {

			// First make the frames spatially aligned
            for (const rs2::frame& f : fs)
                new_frames.emplace_back(f);


        }

    }

    // Convert the newly-arrived frames to render-friendly format
    for (const auto& frame : new_frames)
    {
        // Get the serial number of the current frame's device
        auto serial = rs2::sensor_from_frame(frame)->get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
        // Apply the colorizer of the matching device and store the colorized frame
        render_frames[frame.get_profile().unique_id()] = colorizers[serial].process(frame);
    }

    // Present all the collected frames with openGl mosaic
    app.show(render_frames);

}


//for (auto &&pipe : pipelines){
for (auto &&pipe : pipelines) {

	///auto frames = pipe.wait_for_frames();
	rs2::frameset frames;
	while (!pipe.poll_for_frames(&frames)) {
		
	}
	//setParameters(frames, pipe.get_active_profile());
	const char* serialNumber = rs2::sensor_from_frame(frames)->get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
	std::string currentName;

	char* dirToSave = argv[1];
	std::stringstream ss;
	ss << dirToSave;
	std::string dirStr = ss.str();
	char* imageName = argv[2];
	std::stringstream ss2;
	ss2 << imageName;
	std::string imNameStr = ss2.str();

	if (strcmp(serialNumber, serialNumbers[0].c_str()) == 0) {
		currentName = dirStr + "/" + imageName + "RR.ply";
	}
	else if (strcmp(serialNumber, serialNumbers[1].c_str()) == 0) {
		currentName = dirStr + "/" + imageName + "RL.ply";
	}
	else {
		currentName = dirStr + "/" + imageName + "ERROR.ply";
	}

	//rs2::align align(RS2_STREAM_DEPTH);//RS2_STREAM_COLOR
	//auto aligned_frames = align.process(frames);
	auto color = frames.get_color_frame();
	rs2::pointcloud pc;
	// Tell pointcloud object to map to this color frame
	pc.map_to(color);
	auto depth = frames.get_depth_frame();
	// Generate the pointcloud and texture mappings
	rs2::points points;
	points = pc.calculate(depth);
	points.export_to_ply(currentName, color);
}


return EXIT_SUCCESS;
}

catch (const rs2::error & e)
{
std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n " << e.what() << std::endl;
return EXIT_FAILURE;
}
catch (const std::exception & e)
{
std::cerr << e.what() << std::endl;
return EXIT_FAILURE;
}

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 10, 2020

I read through your script carefully. I note that you are using poll_for_frames, which is appropriate for a multicam setup. When using this instruction, though, you need to be careful about when to put the CPU to sleep and for how long, so that CPU usage is not maxed out (which could introduce problems as processing jams up).

#2422 (comment)

@EmmanuelCos
Author

Thank you for the answer. Actually I don't put the CPU to sleep. How should I do that?

@MartyG-RealSense
Collaborator

A sleep instruction typically takes this form:

std::this_thread::sleep_for(std::chrono::milliseconds(10));

Where the number value in brackets is the number of milliseconds (10 in this example) that you want the CPU to sleep for.
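As a sketch, the busy-wait loops in the code above could be wrapped in a small helper that sleeps between polls (the helper name and signature are mine, not part of the SDK; in practice the `poll` argument would be a lambda wrapping `pipe.poll_for_frames(&frames)`):

```cpp
#include <chrono>
#include <functional>
#include <thread>

// Repeatedly call a poll function with a short sleep between attempts,
// instead of a tight busy-wait that maxes out a CPU core.
// Returns true if poll() succeeded within max_attempts.
bool poll_with_sleep(const std::function<bool()>& poll,
                     int max_attempts,
                     std::chrono::milliseconds interval)
{
    for (int attempt = 0; attempt < max_attempts; ++attempt) {
        if (poll())
            return true;                       // frames arrived
        std::this_thread::sleep_for(interval); // yield the CPU between polls
    }
    return false;                              // gave up after max_attempts
}
```

A call might then look like `poll_with_sleep([&]{ return pipe.poll_for_frames(&frames); }, 1000, std::chrono::milliseconds(1));`, which also bounds the retry count instead of looping forever.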

@EmmanuelCos
Author

Ok, but should I systematically use sleep after poll_for_frames? How long should I set the CPU to sleep?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 10, 2020

I found a multicam example from Dorodnic that uses sleep just after poll_for_frames, and uses a value of 1 millisecond:

#2219 (comment)

@EmmanuelCos
Author

I've tried adding a 1 s sleep after poll_for_frames returns true, but I still get the problem from time to time.
Here is a correct and a wrong point cloud, generated with the same camera, same position and same code (the wrong one has a bad scale, and the colors don't fit the depth):

[Screenshot: correct vs. mis-scaled point cloud comparison]

@MartyG-RealSense
Collaborator

Comparing your wrong and correct images, visually it looks as though something may be going wrong in the application of the Decimation filter and rendering the image at 1/4 scale. I wonder if when it goes wrong, it is applying decimation after align instead of before.

@EmmanuelCos
Author

Actually I don't apply any decimation filter, and the alignment is done after I set all the parameters.

Here it looks like a 1/4 scale, but sometimes it is much further off than that; it can be ~1/10, etc.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 10, 2020

The minimum depth scale is 0.0001. So 0.0003 (larger than 0.0001) should be within allowable bounds. The depth unit scale can cause scale problems when viewing an exported ply in an external 3D modeling program such as MeshLab, Blender or Maya though. If the scale set in RealSense when the ply was created is not the same as the scale set in the 3D software package being used to view the ply then the imported point cloud can appear mis-scaled and distorted.

So if you created the point cloud with a 0.0003 depth unit scale then any program being used to view the ply should be set to that too.
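As a rough numeric illustration of this (a sketch with hypothetical helper names, not librealsense API):

```cpp
#include <cmath>

// Hypothetical illustration of the depth-unit issue described above: a raw
// 16-bit depth value only becomes meters after multiplying by the depth unit
// the camera was configured with.
double raw_to_meters(unsigned short raw, double depth_units)
{
    return raw * depth_units;
}

// Factor by which geometry appears mis-scaled when a viewer assumes a
// different depth unit than the one used at capture time.
double scale_error(double capture_units, double assumed_units)
{
    return assumed_units / capture_units;
}
```

With a cloud captured at 0.0003 and a viewer assuming the default 0.001, `scale_error` gives about 3.3, i.e. roughly a 3.3x scale discrepancy.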

@EmmanuelCos
Author

Yes, but it still doesn't explain the misalignment between color and depth, and why it only occurs from time to time.

@MartyG-RealSense
Collaborator

Does your project involve starting, stopping and then re-starting the streams?

@EmmanuelCos
Author

The function tryPolForFrame above uses
profile = pipe.start(cfg);
This function may be called several times if poll_for_frames didn't work, but I never call stop().

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 10, 2020

Okay, I was just asking because doing start-stop-start in the same program session can cause the SDK to become confused, especially if more than one stream is toggled on and off (e.g. depth on, IR off, depth on).

@MartyG-RealSense
Collaborator

Hi @EmmanuelCos Do you still require assistance with this case please? Thanks!

@EmmanuelCos
Author

Hello MartyG,
Thank you for updating me on this. I actually still have the problem from time to time, and I still can't explain why.

@MartyG-RealSense
Collaborator

I had dealt with a depth units question earlier today, and had come across an example from a RealSense team member for setting the depth units with C++.

#2385 (comment)

It looks as though you have written the instruction correctly, and have also correctly put the instruction before the pipeline starts. Maybe the above link will give a useful insight for your own program though.

@EmmanuelCos
Author

Hello MartyG,

They advise setting the depth options before streaming, so I will try it that way.

I also noticed that they use enable_stream(), while I only use enable_device(). Do you know what exactly the difference is, and what the best practice is?

Thank you,

Emmanuel

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 19, 2020

My understanding is that enable_device is used to make queries to the camera hardware, such as retrieving its serial number, whilst enable_stream is used to define the streams when you want a custom stream configuration instead of the camera's default configuration. For example, to define a custom color stream in C++:

config.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_BGR8, 30);
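As a sketch of how the two calls combine in one configuration (an untested fragment assuming the librealsense2 C++ API; serial_number, resolutions and formats are placeholders):

```cpp
// enable_device pins the config to one physical camera by serial number;
// enable_stream then selects which streams and formats that camera produces.
// Without any enable_stream call, the device's default profile is used.
rs2::config cfg;
cfg.enable_device(serial_number);                                   // choose the camera
cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 30);  // custom depth stream
cfg.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_BGR8, 30); // custom color stream
rs2::pipeline pipe;
rs2::pipeline_profile profile = pipe.start(cfg);                    // start with this config
```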

@EmmanuelCos
Author

I am not sure it's still possible to do it this way nowadays, because this line
auto depthSensor = dev.firstrs2::depth_sensor();
doesn't work: firstrs2 is not a member of dev.

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Jun 19, 2020

Looking at the example script provided by the RealSense team member, it looks as though they set up a list and a dev device definition before calling dev in auto depthSensor = dev.firstrs2::depth_sensor();

[Screenshot of the linked example script]

@EmmanuelCos
Author

Yes, I did that too, but it doesn't work for me because firstrs2 isn't a member of rs2::device.
I am wondering if it's because the case dates from 2018, and maybe this code isn't up to date anymore?

@MartyG-RealSense
Collaborator

The example script I linked to is the only place I can find where the instruction is formatted as dev.firstrs2::depth_sensor(); ... most of the other examples seem to format it as a variation on this line:

auto sensor = profile.get_device().firstrs2::depth_sensor();

This is similar to the format in your own script, which was:

auto depth_sensor = selected_device.firstrs2::depth_sensor();

The main difference that I see is that instead of using profile directly, you use the selected_device variable that is equal to profile.get_device()

Does it make any difference if you take out selected_device and put in the full device definition instead?

auto depth_sensor = profile.get_device().firstrs2::depth_sensor();

I appreciate your patience with my programming knowledge limitations!

@EmmanuelCos
Author

Ah, I understand: it's because the < > were deleted when writing it here on GitHub...
The true line of code is:
auto depthSensor = dev.first<rs2::depth_sensor>();
Sorry, I should have noticed that.
So I managed to make the change; I will keep you posted over the next few days on whether it works or not.
Thank you for your help!

@MartyG-RealSense
Collaborator

Ah yes, I see. I didn't notice either that the < > were being deleted from my comment. Sorry about the confusion!

[Screenshot showing the < > stripped from the earlier comment]

@MartyG-RealSense
Collaborator

Hi @EmmanuelCos Do you still require assistance with this case please, or can it be closed? Thanks!

@MartyG-RealSense
Collaborator

Case closed due to no further comments received.
