Operation Honortech



MISSION
Bring forth awesomeness using free or open source software.




Starting out on the filmmaking path, one of the hurdles every filmmaker goes through is the technical hiccups that usually happen during post-production. To achieve certain effects or even just to edit the film, a filmmaker needs software to do the task. And that software usually costs a fortune.

Not many filmmakers have enough budget to shoot their film, let alone buy this software, so some try to get it through trial versions or torrents. But as most of us know, trials are a pain because of their limits, and torrented copies are illegal, especially if you're using them commercially.

So, to avoid using trial versions and torrented software for commercial work, I initiated Operation Honortech.




These are the free and open source products that I will use for this mission:






For me, Ubuntu Studio is the best free/open source operating system so far. The interface looks great, and since it's an OS specialized for "creative humans", it's optimized to handle multimedia tasks from editing a video to making music. I think it's the best operating system to use for this operation.

Check it out and learn more about it here.





Kdenlive is the best free/open source video editing software I've used so far. It's light (it doesn't lag much even on my laptop), and its interface is the closest to Adobe Premiere's. I can pretty much edit videos with it the way I edit videos in Premiere.


Check it out here!






Among all of the free/open source software, Blender is my favorite. I was about to give up on my dreams, but Blender changed my mind. It's a free 3D animation suite. This is my alternative to After Effects, and so far, it's awesome!

Check it out here!






GIMP, I think, is the best alternative to Photoshop. Among the free/open source software here, it's the one I've been using the longest. I've made digital artworks, edited photos, and created most of the images on this blog with this guy.

Learn more about it here.



Krita is the best for digital painting! I'd still use GIMP for regular photo manipulation and editing, but Krita is going to be my main software for making digital art. I think it's the best FREE software specialized for digital painting.

Just check it out here.





I use Audacity to process the recordings I make with my phone. It cleans up the audio pretty well too.

Check it out here.


For music production, I'll be using LMMS. It's the closest thing to FL Studio.

Check it out here.






And I'll be using Ardour to add sound to my animations, or to mix the music I make. :)

Check it out here.






For screen recording, I'll be using SimpleScreenRecorder so I can make tutorials and timelapse videos.

Check it out here.







SO YEAH!

Since 2016, I have been using this software to make my videos.
Just head over to my channel and see the results. :)




Character Model View and Download

Doodle Notes Original Characters

DOODLY
"Our awesome mascot!"




Fan Mode Characters


OLAF
Download here!

BAYMAX
Download here!




SCRAT
Download here!




MEOWTH
Download here!

TURBO
Download here!




Quick Camera Tracking Tutorial in Blender



This is a written, more detailed version of the Quick Camera Tracking tutorial in Blender,
for those who found the video too fast. 😄😄😄

QUICK CAMERA TRACKING IN BLENDER


This is how I tracked the camera in the "When you hum the Game of Thrones Theme" AVAV.

STEP 1. PREPARE OR SHOOT THE FOOTAGE


If you don't have a camera, or just want to practice camera tracking without the hassle of shooting,
you can download the footage I used in the AVAV here:


But if you want to shoot your own footage, here are the things you need to take note of:

Camera movement
It should be minimal. That means no shakes, no major changes in camera angle,
no fast movements, and no zooming in or out.

List down your camera settings:
Focal Length
Sensor Size
Frame Rate



STEP 2. START CAMERA TRACKING



Open up Blender and change the layout to Motion Tracking.



Open the footage you want to track.
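If you ever want to script these steps instead of clicking through them, most of this tutorial maps onto Blender's Python API. Here's a minimal sketch of loading the clip; the file path is just a placeholder, and the clip operators used in later steps expect the Movie Clip Editor to be the active area (so run them from its Python console or through a context override).

    import bpy

    # Load the footage as a movie clip (placeholder path; point it at your own file).
    clip = bpy.data.movieclips.load("/path/to/your_footage.mp4")

    # Show it in the Movie Clip Editor, if one is open in the current layout.
    for area in bpy.context.screen.areas:
        if area.type == 'CLIP_EDITOR':
            area.spaces.active.clip = clip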





Change the frame rate in Blender
to match the frame rate of the raw footage.
(29.25 if you used the raw footage provided above)

This is important because if it's not set right,
the track may look fine while tracking,
but it gets messed up after solving.
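For reference, here's what that same setting looks like in a quick Python sketch. Blender stores the frame rate as an integer fps divided by fps_base, so 29.25 fps can be written as 117 / 4.

    import bpy

    scene = bpy.context.scene

    # 117 / 4 = 29.25 fps, matching the raw footage above.
    scene.render.fps = 117
    scene.render.fps_base = 4.0

    print(scene.render.fps / scene.render.fps_base)  # 29.25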






Pick a frame from the raw footage where the CGI element will mostly sit.
(In the footage provided, pick the last frame)
This just ensures that the track is more accurate around the frames that matter. :)


After picking a frame, press "Detect Features" on the Track panel.
This will automatically plot track points on the frame.


To get more track points, go to the Detect Features options below the Track panel
and change the Threshold and Distance to lower values.
(In the footage above, I changed the Threshold to 0.150 and the Distance to 70)
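If you're scripting it, the Detect Features button corresponds to an operator that takes the same two values (run from the Movie Clip Editor context, as noted earlier):

    import bpy

    # threshold and min_distance mirror the Threshold and Distance options;
    # lower values give you more track points.
    bpy.ops.clip.detect_features(threshold=0.150, min_distance=70)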


If you're satisfied with the number of track points, go back to the Track panel and start the track.
(Since we picked the last frame of the footage above, we track backwards.
Track forwards if you picked the first frame,
or, if you picked a frame in the middle of the footage, track forwards and backwards from that frame.)
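Scripted, tracking all the selected markers through the clip is a single operator call; the backwards flag just matches which end of the footage you started from.

    import bpy

    # backwards=True because we started from the last frame;
    # use backwards=False if you picked the first frame instead.
    bpy.ops.clip.track_markers(backwards=True, sequence=True)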


After the tracking is done, expand the Graph Editor to clearly see the curve representation of each track.


Examine the graph and look for a curve that is very different from the rest,
select it, and then delete it by pressing "X" and selecting "Delete Curve".


Examine the graph for more stray curves and delete them.
These curves are easy to spot because they stand out from the rest.
They come from tracks that drifted far away from the point they were supposed to be tracking.
That's why they need to be deleted: to avoid giving Blender false information about the track.
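Cleaning up by eye in the Graph Editor is what I do here, but for completeness, Blender also has a Clean Tracks operator that can drop bad tracks automatically: very short tracks right away, and high-error tracks once a solve exists. The numbers below are just example values.

    import bpy

    # Delete tracks shorter than 10 frames, or (after a solve) tracks whose
    # average reprojection error is above 2.0 pixels. Example values only.
    bpy.ops.clip.clean_tracks(frames=10, error=2.0, action='DELETE_TRACK')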

After cleaning up the track, head over to the panel at the right side of the footage
and look for the camera and lens settings.
This is the part where we'll need the Sensor Size and the Focal Length of your camera.
If you don't know what they are, you can use the camera presets provided by Blender (see image above)
or try looking up your camera's specs through Google.
I can't really tell you how else to find them, unless you're using your phone as your camera (which I did):


To determine the Focal Length and Sensor Size of my phone camera, I used an app called Phone Tester.
There are a lot of apps like this one that tell you all the specs of your phone.
Just look for one in your app store and download it for free.


Set the focal length and sensor size equal to your camera's.
You'll only need the width for the sensor size.
(For the footage above, the Focal Length is 2.94 mm and the Sensor Width is 3.6 mm)
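In a script, those two values live on the clip's tracking camera; the numbers below are the ones for the provided footage, so swap in your own.

    import bpy

    clip = bpy.data.movieclips[0]  # or look it up by name
    camera = clip.tracking.camera

    # Both values are in millimetres; only the sensor width is needed.
    camera.focal_length = 2.94
    camera.sensor_width = 3.6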


After setting up the camera settings, go to the Solve panel and press "Solve Camera Motion".

This process will be done in a few milliseconds
and gives you a Solve Error value afterwards.
For best results, the solve error should be less than 0.2.

If the Solve Error is higher than 0.2,
I found that the best workaround is to play with the settings on the Solve panel. In short, trial and error.
On the Solve panel, just set Keyframes A and B to frames that have some significant change between them.
For example, in the footage used above, between frames 70 and 140
the camera tilted from a higher angle to a lower angle.

You can also tell Blender to refine some camera settings.
(For the footage used, refining K1 and K2 did the trick)

Don't forget to press the Solve Camera Motion button each time you make changes.
You will see the solve error drop as you adjust the Solve panel settings.
Keep going until the solve error is lower than 0.2.
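Here's the scripted version of that loop: run the solve, then read the average error back to check it against the 0.2 target. Keyframes A/B and the refine options I'd still set in the Solve panel itself.

    import bpy

    clip = bpy.data.movieclips[0]

    # Solve the camera motion (same as pressing "Solve Camera Motion";
    # needs the Movie Clip Editor context like the other clip operators).
    bpy.ops.clip.solve_camera()

    # Aim for an average error below 0.2.
    print("Solve error:", clip.tracking.reconstruction.average_error)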

If the solve error is lower than 0.2, you can now set the Orientation.
Just select THREE track points on the floor in the footage, go back to the Solve panel,
and under the Orientation options, press "Floor".

To set the scale, select TWO track points, preferably ones that are close to each other,
and press "Set Scale".

Select ONE track point and press "Set Origin" to set the center of the 3D view.
You can also set the X-axis, if you want to.

You can preview the Orientation in the 3D view at the upper corner of the window.
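Those Orientation buttons map to operators too; each one acts on whatever tracks are currently selected, so select the points first (three on the floor, then two for scale, then one for the origin). The scale distance below is a placeholder value.

    import bpy

    # With THREE floor tracks selected:
    bpy.ops.clip.set_plane(plane='FLOOR')

    # With TWO tracks selected; the distance between them becomes 1.0 unit (placeholder):
    bpy.ops.clip.set_scale(distance=1.0)

    # With ONE track selected:
    bpy.ops.clip.set_origin()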

If you're satisfied with the Orientation, just go back to the Solve panel,
head over to the Scene Setup options and press "Setup Tracking Scene".

Blender will then automatically set things up in the 3D viewport
(set the raw footage as the background image, add the track points as empties, and apply the camera movement)
and will also automatically create a compositing node setup.
You can then preview and make more changes in the viewport.
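And that last button is a single operator call as well, in case you're running the whole thing from a script.

    import bpy

    # Builds the background footage, the tracked camera, and the compositing
    # node setup automatically, same as pressing "Setup Tracking Scene".
    bpy.ops.clip.setup_tracking_scene()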


AND THAT'S IT!

That's how I track, and plan to keep tracking, the camera for my AVAV videos whenever I need to do it quickly. :)

If you have any questions, don't hesitate to ask them in the comments below!