Attended Center CLR Try! Development #2
I recently attended “Center CLR Try! Development #2”.
The venue was different from #1's: right in front of Kanayama Station and easy to get to. However, the building was under construction, which made the entrance hard to find, and I wandered around for a few minutes… (Being directionally challenged, I initially went to the wrong building.)
There were three participants that day (Mr. Matsui, Mr. Matsuoka, and myself). Being alongside two amazing individuals made me a little, no, quite nervous. But thanks to the small group size, I was able to talk about various things, making it a meaningful day.
What I Did
I had originally planned to spend the day working on an app I was developing in Unity, so that’s what I did.
I’m creating a prize lottery app to be used at my department’s year-end party.
I’ve been using Unreal Engine as my game development tool for a long time, but I had no experience with Unity, so I decided to give Unity a try starting this month. (* Although I use Unreal Engine a lot, I’ve never actually made a game.)
By the way, the deadline I set for myself is the end of November, so I have about a month.
A Bit More Detail
Using Unity’s UI features (equivalent to UMG in Unreal), I was creating the participant registration screen for the year-end party and implementing a feature to register photos taken with a smartphone camera.
I was worried I might have to use OS-dependent APIs for camera shooting, but it was surprisingly easy to achieve using only Unity’s features!
(I’m requesting camera usage permission in the app beforehand.)
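For reference, here is a minimal sketch of what that permission request can look like. The CameraPermission component is my own illustration, not code from the actual app, and it assumes Application.RequestUserAuthorization covers the target platform (behavior differs per platform).

using System.Collections;
using UnityEngine;

public class CameraPermission : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the OS for webcam access before creating any WebCamTexture
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);

        if (Application.HasUserAuthorization(UserAuthorization.WebCam))
        {
            // Safe to start the camera from here
        }
        else
        {
            Debug.LogWarning("Camera permission was denied.");
        }
    }
}

With permission granted, getting the feed on screen took only the following: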
// Get a camera attached to the execution device (the first one if there are several;
// note that WebCamTexture.devices is empty when no camera is available)
var cameraDevice = WebCamTexture.devices[0];
// Create a texture that the camera feed will be rendered into
// (cameraTexture is a WebCamTexture field, since the capture code later reuses it)
cameraTexture = new WebCamTexture(cameraDevice.name, 1280, 720);
// Start shooting
cameraTexture.Play();
// Display the camera feed on the UI (targetImage is a reference to a RawImage on the UI)
targetImage.texture = cameraTexture;
Running the code above displays the camera feed in real time on the RawImage in the UI. Presumably the WebCamTexture itself is updated with the camera feed every frame, and the RawImage simply renders whatever that texture currently holds.
This time, I wanted to save a still image, so I executed the following code when a UI button was pressed to save the image data.
// Get the pixel buffer of the camera image at that moment
Color32[] color32 = cameraTexture.GetPixels32();
// Create a texture of the same size to hold the captured still
var cameraImage = new Texture2D(cameraTexture.width, cameraTexture.height);
// Copy the pixel buffer into the texture and upload it to the GPU
cameraImage.SetPixels32(color32);
cameraImage.Apply();
// Stop shooting now that the still has been captured
cameraTexture.Stop();
Now cameraImage contains the image captured by the camera as a texture. Assigning it to the UI's RawImage gave me the functionality I wanted.
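Concretely, that last step is a one-line texture swap (assuming the still goes onto the same targetImage that showed the live feed):

// Replace the live feed with the captured still
targetImage.texture = cameraImage;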
That's as far as I got that day. Going forward, I plan to shrink and crop the captured image for thumbnails and serialize it so it can be reused on the next launch.
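I haven't written that part yet, but a rough sketch of what I have in mind could look like the following. The ThumbnailUtil class, the method names, and the choice of PNG files under Application.persistentDataPath are all placeholder decisions of mine, not anything from the actual app.

using System.IO;
using UnityEngine;

static class ThumbnailUtil
{
    // Crop the largest centered square out of the source and resample it
    // down to size x size pixels with bilinear lookups
    public static Texture2D MakeThumbnail(Texture2D source, int size)
    {
        int square = Mathf.Min(source.width, source.height);
        float offsetU = (source.width - square) * 0.5f / source.width;
        float offsetV = (source.height - square) * 0.5f / source.height;

        var thumbnail = new Texture2D(size, size);
        for (int y = 0; y < size; y++)
        {
            for (int x = 0; x < size; x++)
            {
                // Map each thumbnail texel center into the cropped square in UV space
                float u = offsetU + (float)square / source.width * (x + 0.5f) / size;
                float v = offsetV + (float)square / source.height * (y + 0.5f) / size;
                thumbnail.SetPixel(x, y, source.GetPixelBilinear(u, v));
            }
        }
        thumbnail.Apply();
        return thumbnail;
    }

    // Serialize the thumbnail as a PNG file so it survives an app restart
    public static void Save(Texture2D thumbnail, string fileName)
    {
        byte[] png = thumbnail.EncodeToPNG();
        File.WriteAllBytes(Path.Combine(Application.persistentDataPath, fileName), png);
    }

    // Reload a previously saved thumbnail on the next launch
    public static Texture2D Load(string fileName)
    {
        string path = Path.Combine(Application.persistentDataPath, fileName);
        var texture = new Texture2D(2, 2); // dimensions are replaced by LoadImage
        texture.LoadImage(File.ReadAllBytes(path));
        return texture;
    }
}

PNG keeps the save/load code trivial; dumping raw pixels would be faster, but then the dimensions would have to be stored separately.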
Summary
- Managed to capture smartphone camera images in Unity.
- Small-group events are very meaningful as you get to talk with amazing people.
- Might soon take the first step toward being able to read IL.