
Working PS Move plugin for UE4; need help coreg with DK2

bullale
Honored Guest
Update:

The plugin is not done but it's ready for public testing. I made a new thread here.


Hello all,

I recently made a PSMove plugin for UE4 that works in both Windows and OSX. It should work in Linux too with a simple compile of the psmoveapi binaries. The changes I had to make to the psmoveapi and PS3EYEDriver should also help anyone that wants to use the PSMove controllers in Windows or OSX (it's always worked in Linux) outside of UE4.

I need help co-registering the coordinates I get from the psmove (in cm from the camera) with the DK2's coordinate space.

For orientation, roll and pitch are easy. The PSMove has a magnetometer, so it's possible to get its true yaw, but to make that useful in game I would also need access to the DK2's magnetometer (I've read mixed reports on whether it even has one). Does anyone have any ideas here? Does it truly have a magnetometer? If so, does anyone know how to access it from within UE4?
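For reference, here is a minimal sketch (my own illustration, not code from the plugin; the sign and axis conventions are assumptions) of how a magnetometer reading can be tilt-compensated into a yaw angle using a gravity estimate. Something like this would be needed on the DK2 side too:

```python
import math

def tilt_compensated_yaw(accel, mag):
    """Estimate heading (yaw) from a gravity vector and a magnetometer reading,
    both in the device's body frame. Returns degrees, 0 = magnetic north.
    Conventions here (x forward, y right, z down-ish) are assumptions."""
    ax, ay, az = accel
    g = math.sqrt(ax * ax + ay * ay + az * az)
    ax, ay, az = ax / g, ay / g, az / g
    # pitch and roll recovered from the gravity direction
    pitch = math.asin(-ax)
    roll = math.atan2(ay, az)
    mx, my, mz = mag
    # rotate the magnetometer reading into the horizontal plane
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-yh, xh)) % 360.0
```

With both devices reporting a tilt-compensated heading against the same magnetic north, their yaws share a common reference without any user-driven alignment step.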

If it does not, then I'll need to co-register the orientations by relying on the user to point the controller in a specific orientation and press a button. I'm not too excited about that option.

For position, I would love to access the DK2's camera's image feed. I could have users run a quick calibration app that asks them to draw in the air; for each frame I would estimate the PSMove controller's position in 3D space relative to each camera, then create a transformation matrix from PS3Eye coordinates -> DK2 camera coordinates. This matrix could be stored in a data file (or registry entry) somewhere, and as long as the cameras don't move relative to each other it should stay the same. Any ideas on accessing the DK2's images, even outside of a game engine or the SDK?
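As a sketch of the math behind that transformation matrix (my own illustration, not code from the plugin): given simultaneously captured point pairs from the two trackers, a least-squares 3x4 affine fit via the normal equations would look something like this:

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (A is n x n)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine(src, dst):
    """Least-squares 3x4 affine transform mapping src points onto dst points.
    src, dst: lists of (x, y, z) samples captured at the same instants from the
    two trackers. Returns a row-major 3x4 matrix [R | t]."""
    rows = [list(p) + [1.0] for p in src]  # n x 4 design matrix
    AtA = [[sum(rows[k][i] * rows[k][j] for k in range(len(rows)))
            for j in range(4)] for i in range(4)]
    return [solve_linear(AtA,
                         [sum(rows[k][i] * dst[k][axis] for k in range(len(rows)))
                          for i in range(4)])
            for axis in range(3)]

def apply_affine(T, p):
    x, y, z = p
    return tuple(T[r][0] * x + T[r][1] * y + T[r][2] * z + T[r][3] for r in range(3))
```

A plain affine fit absorbs any scale mismatch between the two trackers; if the transform should be strictly rigid (rotation plus translation only), a Kabsch/Horn-style solve would be the more principled choice.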

If I can't access the DK2's camera feed, then I'll probably need to touch the PSMove to specific locations on the surface of the DK2 and sample >=4 fiducial locations. I would need accurate dimensions of the DK2 body relative to whatever origin it uses. Anyone have any experience with this?

Accuracy is important to me. We've all read that accuracy goes a long way toward immersion, so I hope it's important to you too, but it's especially important to me because I'm using the data for research. So I can't really accept calibration methods that simply ask the user to put the controller near where they think it should be in the world and press a button.

bullale
Honored Guest
"SethVR" wrote:
I wish there was a way to make this work with the gearvr using the note 4 camera to track the move controller connected by bluetooth


It should be possible. However, is the Note 4 "strong" enough to do the image processing while running GearVR? Below is sort of an intellectual exercise in what it would take to get the psmoveapi working on the Note 4, but I think a better solution would be to write code optimized specifically for Android, including custom image processing.

The psmoveapi is designed for use on Linux first, Mac second, and Windows a distant third. I suspect it would be possible to modify it to work on Android too. The only external libraries that are absolutely necessary are hidapi, opencv, and the Madgwick IMU sensor fusion algorithm. The former two definitely have Android implementations, and I think the latter does too.

The Note4's camera is actually quite good, so I think you would get good tracking with it. And there wouldn't be a co-registration problem because you'd always know exactly where the camera is relative to the headset.

I don't have a Note 4 and I'm not about to buy one. But if there are any Android developers with a Note 4 and a PSMove then I can point out the parts of the psmoveapi they'd have to test and modify. Off the top of my head:

Make sure the correct camera is found. Android has Linux's v4l2, so it should just be a matter of enumerating the devices and selecting the correct one automatically.
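As a trivial illustration of that selection step (the device names below are made up; the real PS Eye may enumerate under a different string), it could be a simple name filter over whatever v4l2 reports:

```python
def pick_pseye(devices):
    """Given [(path, name), ...] pairs as enumerated from v4l2 (e.g. read from
    /sys/class/video4linux/*/name), pick the PS3 Eye if present.
    The "eye" match string is an assumption about how the device identifies itself."""
    for path, name in devices:
        if "eye" in name.lower():
            return path
    return None
```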

Make a new pairing process (pairing relies on a USB connection to the host device, but most users won't have the correct mini-USB (PSMove) <--> micro-USB (Note 4) cable, so it'll be better to use a computer to send the Note 4's Bluetooth address to the PSMove).

That's it for the psmoveapi.

Then you'd have to make the Unity/UE4 plugin understand the different coordinate system of the Note 4 camera when in landscape mode, and you'd have to hardcode the position offset relative to the GearVR. But these changes are trivial.
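For what it's worth, that coordinate fix-up is just an axis remap plus a constant offset. A sketch (the rotation convention and the offset values are placeholders I made up, not measured GearVR numbers):

```python
def note4_cam_to_hmd(p, offset=(0.0, -0.03, 0.05)):
    """Map a position reported in the phone camera's portrait frame into the
    headset frame when the phone is mounted in landscape.
    Assumed convention: landscape rotates the image 90 degrees, so camera x
    becomes headset -y and camera y becomes headset x. `offset` is the camera's
    position relative to the headset origin in metres (placeholder values)."""
    x, y, z = p
    ox, oy, oz = offset
    return (y + ox, -x + oy, z + oz)
```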

zalo
Explorer
Hey, I wrote a C# script a while back for doing blob tracking in Unity off the webcam interface:


(In this example, my phone's screen displays a cyan square, filled black, that changes size when I turn the phone on its side.)

It's probably more portable (for Gear VR stuff) than the main PSEye lib; let me know if this might be useful to you and I'll see if I can clean up the code a little.

It doesn't have camera calibration worked in yet, which will become particularly important once the GearVR gets a fisheye lens for inside-out positional tracking.

Fredz
Explorer
I guess it should be possible; IIRC the PS Move API did work on Android previously, though I have no idea if that's still the case. But with the low FOV of the camera I'm not sure that would be great for positional tracking. Maybe with a fisheye lens, but then it's more complicated to undistort the capture.

Fredz
Explorer
Some progress report on my attempt at co-registration.

I've updated the code to log the values only when the DK2 is mostly facing the camera (angle < 5° from the camera axis); otherwise I think the co-registration is going to be harder to compute. It's quite difficult to stay correctly oriented, so I'll probably add visual indications of how the DK2 must be oriented to correctly face the camera.
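The gating test itself is just the angle between the DK2's forward vector and the camera axis. A minimal sketch (assuming both vectors are expressed in the DK2 camera's frame, with the camera looking down -Z; those conventions are mine, not the SDK's):

```python
import math

def facing_angle_deg(forward, cam_axis=(0.0, 0.0, -1.0)):
    """Angle in degrees between the headset's forward vector and the camera axis."""
    dot = sum(a * b for a, b in zip(forward, cam_axis))
    na = math.sqrt(sum(a * a for a in forward))
    nb = math.sqrt(sum(b * b for b in cam_axis))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def accept_sample(forward, max_deg=5.0):
    """Keep a co-registration sample only when the DK2 mostly faces the camera."""
    return facing_angle_deg(forward) < max_deg
```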

I display cyan and yellow squares in real time at the positions that have been captured. The pairs seem to be separated by a similar distance when I move the DK2, with the PS Move attached to it with two rubber bands. There are a lot of hiccups, but that's probably because the batteries are low on the Move (I forgot to charge it yesterday). When there are no hiccups, the trails look visually similar.

I'll then compute the range of distances between pairs to see if the separation really is mostly constant, but I fear there must be distortions at the edges of the cameras' FOVs, since I didn't calibrate the PS Eye with the tracker_camera_calibration utility available with the PS Move API.
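That consistency check might look like this (a rough sketch; `math.dist` needs Python 3.8+). If the two rigidly mounted trackers agree, the spread between min and max should be small:

```python
import math

def pair_distance_stats(move_pts, dk2_pts):
    """For each simultaneously captured (PS Move, DK2) position pair, compute
    the separation between the two reported points.
    Returns (min, max, mean) of the distances."""
    d = [math.dist(a, b) for a, b in zip(move_pts, dk2_pts)]
    return min(d), max(d), sum(d) / len(d)
```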

If there is distortion, I'll try either to calibrate the PS Eye with that utility, hoping the calibration will carry over to other people's cameras, or to do it all at once with the co-registration, if that's even possible.

If anyone has better ideas about which direction I should try, don't hesitate to tell me.

Obligatory screenshot :

PS Move Setup 3.png
Sorry if my English is barely readable, I'm dead tired tonight. 😛

bullale
Honored Guest
See this post for someone else's calibration results.

Fredz
Explorer
Thanks. I read that post some days ago and it didn't seem relevant at the time, but now it clearly is. I'll see if I can use these values directly.

AlphaWolF
Honored Guest
"Fredz" wrote:
I've installed Zadig and now the camera turns red when I launch test_tracker.exe. The PS Move doesn't light up with the WinUSB driver and I have this message, like when the CL Eye driver was installed :

### Found 1 controllers.
Trying to init PSMoveTracker...Warning: No lens calibration files found.
C:\Users\Fred\AppData\Roaming\.psmoveapi\colormapping.dat is too old - not restoring colors.

But with the libusb-win32 driver it does work, the orb lights up in magenta and a window is opened that shows it being tracked. The tracking works flawlessly. Many thanks !

Now I'll see if it does work inside Unity in 32 bit and 64 bit mode and then I'll have a look at calibrating the PS Move using the DK2 positional tracking. If you want me to investigate something in particular don't hesitate to ask.

Thanks again, great job !


No matter what driver I use, no matter what port I use, I'm also stuck with this problem, and I get the same error message. I've tried all the other tools in psmoveapi_3.0.0_win32 and the Move Framework. My PS Move is perfectly paired and calibrated, from what I can tell using calibrationtool_3.2 in the Move Framework folder.

But all I get is the red light on the Eye and the "Warning: No lens calibration files found" message.

bullale
Honored Guest
I messaged you on reddit but I'll post here too.

Please read my new wiki entry for setting up the PS Eye. At the bottom is the output I get from the command prompt when running test_tracker.exe. Notice that I get the same warning you do. It's just a warning, not an error. (The bottom section is new since I sent you the Reddit comment, so check again if you looked before and didn't see it.)

You don't need the camera for the plugin if all you want is rotation, buttons, and vibration. If you leave the camera unplugged and run the UE4 plugin that should still work, assuming your controller is paired and connected.
(BTW, there was a tiny bug in the plugin that prevented it from being built in Windows that I fixed just this afternoon).

Fredz
Explorer
"AlphaWolF" wrote:
No matter what driver I use, no matter what port I use, I'm also stuck with this problem

The first time I used the libusb-win32 driver with Zadig it worked correctly (still with the warning message), but it did not with the WinUSB driver.

Yesterday I plugged/unplugged other USB devices and it no longer worked. I had to find a combination that worked by testing several, one at a time, rebooting between attempts. It's far from ideal, but you may give that a try until someone finds out where the problem is.

On another note I've relaunched my tests with a fully charged PS Move and the result is much better (but still with some hiccups) :

PS Move Setup 5.png
You can see that the tracking is now very similar, but it seems to differ a bit on the left. I guess that's because of the simple algorithm used for tracking (circle approximation). So even if I used calibration data from the camera, the tracking precision would most certainly still suffer from the tracking algorithm.

I think a more complex algorithm that doesn't rely on a circular or even elliptical shape should be implemented, since I'm not sure the projection of a sphere onto a plane is even an ellipse. I've read that here (search "not an ellipse" on the page): http://www.handprint.com/HP/WCL/perspect5.html

Illustration :


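Whether or not the outline is exactly an ellipse, a 1-D pinhole model already shows why a simple circle fit is biased: off-axis, the midpoint of the sphere's silhouette on the image plane is not the projection of the sphere's centre. A rough sketch (all symbols are my own illustration, not from the PS Move API):

```python
import math

def silhouette_midpoint_bias(theta_deg, d, R, f=1.0):
    """1-D pinhole model: a sphere of radius R at distance d from the camera,
    at angle theta from the optical axis. Returns (projected_center,
    silhouette_midpoint) on an image plane at focal length f. Off axis the two
    differ, so a naive circle/blob fit misestimates the sphere's direction."""
    alpha = math.asin(R / d)  # half-angle of the sphere's outline cone
    t = math.radians(theta_deg)
    center = f * math.tan(t)
    mid = f * (math.tan(t - alpha) + math.tan(t + alpha)) / 2.0
    return center, mid
```

On the optical axis the two coincide; toward the edge of the FOV the silhouette midpoint drifts outward relative to the true centre, which matches the systematic error Fredz sees at the sides of the image.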

So for now I'll simply continue with co-registration.

Anonymous
Not applicable
Hi Bullale/Fredz, I followed you over from the Reddit topic (not stalking ;-))

I'm trying to get the moves working so I can start playing around with them in Unity/UE4, but I'm having a few higher-level issues at the moment.

I've got a PSEye and two PSMove controllers I sourced off eBay. I can get the PSEye driver to work fine with libusb, but I'm hitting quite a few issues pairing the Moves via BT to my Windows 8.1 machine. I've pulled your fork of the psmoveapi repo and built everything using MSVC2013 (libusb, OpenCV and psmoveapi).

Now, when I plug either of my Move controllers in via mini-USB cable (I've tried both), the little red LED flashes at 5-second intervals. I then run psmove-pair-win.exe and it says it found a controller. Then it hangs for ages. During this time I can hear the Windows sound cue for a USB device plugging in/unplugging in time with the red LED flashes. Is this normal?

What I've also found is that, if I push the PS button on the Move while the LED is lit, psmove-pair-win.exe will progress but, after about 5 seconds, the USB unplugging sound chimes and the LED goes out. I can repeat the "push PS button" step multiple times and eventually psmove-pair-win.exe finishes, but when I unplug the USB cable and press the PS button to connect via Bluetooth, nothing happens (no LEDs, no orb light, etc.).

I hate to sound thick, but am I doing it wrong? Am I always supposed to have the USB cable plugged in (I got confused when psmovepair.exe said to unplug it and use BT to do the rest)?

I've managed to complete the pairing twice now, but only when the Moves are connected via USB permanently and I press the PS button multiple times every time it disconnects/the LED goes out. After this I was able to do the following:


  • unplug move

  • plug move in

  • wait for led to light up and "USB connect" sound to chime

  • if I'm quick enough and launch test_tracker.exe while the LED is on and USB is still connected, a terminal opens and says it found a controller, but it doesn't go any further

  • Pressing the PS button every 5 seconds in time with the red LED, I can eventually get the pairing/tracking app to start.


All in all, I've successfully paired/synced and run test_tracker.exe once in about 4 hours of trying and I can't repeat it.

Queries:

  • Is USB always required?

  • Is Windows supposed to constantly connect/disconnect the Move until something using the psmoveapi tries to take control of the device?

  • Should repeatedly pressing the PS button be required to pair/use the move controllers?


I know this doesn't really contribute to developing psmoveapi further, but just being able to use the device would be a good first step 😛

Thank you for any help you can provide.

Cheers,
cpldave.