Rifle housing. The structure to accommodate the rifle was designed using SketchUp.
To this end, the rifle was precisely measured (since the replica does not comply with the
original rifle design at all). The measurements were transferred onto a plywood sheet, from which a mold was made
to support the subsequent design phases. Each element of the structure was then exported in a 2D CAD format,
imported into a CAM program, and machined from paper-coated medium-density fibreboard with a 6-DoF CNC machine.
CNC machining was essential to meet the requirements in terms of angular-measurement accuracy:
the sensor housings, in fact, were placed and machined exactly where intended, avoiding
any off-centre issues. As a final touch, all the components were painted gunmetal
gray (RAL 7024) using spray paint.
The two DoF of the structure (pitch and yaw) are provided, respectively, by two arms rotating about
a nylon pivot and by a large circular bearing fixed to the base plate. The nylon pivot was designed and machined on a lathe.
Printed circuit board. The printed circuit boards were designed using Altium Designer 18.0.
Before the design, we built a prototype on a perfboard using the wire-wrapping technique.
We made the first PCB ourselves, using DIY photolithography followed by chemical etching.
The PCB was dual-layer in through-hole technology, and all the vias and holes had to be made by hand
using a drill press. We then made some changes to the design, and the new board was realized by
Printed 3D structure. The 3D-printed structure was designed using Tinkercad,
a free online modelling tool aimed at beginners and very easy to use, much like assembling LEGO bricks. This solution is not suitable
for very complex projects, but it was the best choice for building a prototype without deep expertise in 3D design and printing. An important aspect
was the strength and stiffness of the printed material, which has to support the weight of the motors without breaking
or bending, thereby avoiding shifts of the rotation centre. Nylon best fits these requirements, but due to its high cost we
decided to build the structure from another material. The choice was between PLA, which is cheap and very strong but quite
sensitive to heat and degradation, and ABS, which is cheap and strong but flexible.
Given the heat produced by the motors and the laser, we chose ABS. Aware of the material's flexibility, we set the
print infill to 50%, which avoids bending while keeping costs down.
Power supply and PCB housing. In our tests the system was mains-powered; the power supply
was chosen to provide two different output voltages, one for the motors' H-bridges and one for
the control electronics. To guarantee operation in the worst-case scenario, the power supply has to deliver 60 W,
and an isolated flyback power supply was chosen.
The power supply and PCB were then enclosed in a
44 CE Range watertight junction box.
Measurements. In order to mathematically define the kinematics of the gun barrel,
a number of rigid transformations had to be computed. Most of them contain distances
and rotations between keypoints on the rifle; such quantities have to be measured with
the least possible uncertainty so as not to compromise the aim. Since it is very difficult to make such
measurements with conventional instruments on the uneven surfaces of the rifle, we adopted a vision-based
approach. A high-resolution reflex camera with a suitable lens was used to take
a set of quasi-orthographic photos of some regions of interest of the rifle. In this case, it is very
important to take lens distortion into account and correct it, and to minimize perspective effects, in order to
obtain accurate measurements. Also, a reference distance has to be known with the highest accuracy in order
to obtain metric distances. The software used was ImageJ.
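The pixel-to-metric conversion behind this step can be sketched as follows; the function mirrors what ImageJ's "Set Scale" feature does, and the reference length and keypoint coordinates below are illustrative values, not actual measurements from the rifle.

```python
import math

def metric_distance(p1, p2, ref_px, ref_mm):
    """Convert a pixel distance between two keypoints into millimetres,
    using a reference length of known size visible in the same
    quasi-orthographic (distortion-corrected) photo."""
    scale = ref_mm / ref_px              # millimetres per pixel
    return math.dist(p1, p2) * scale     # pixel distance times scale

# Hypothetical example: a 100 mm reference bar spans 2000 px in the photo,
# and two keypoints on the rifle are 1500 px apart.
print(metric_distance((100, 250), (1600, 250), ref_px=2000, ref_mm=100))  # 75.0
```

Note that this simple scaling is only valid after lens distortion has been corrected and perspective effects minimized, as discussed above.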
Kinect interface. Interfacing a Windows PC with the Kinect is made possible by the Kinect for Windows SDK 2.0
(although discontinued and very buggy). First tests were made in MATLAB with the Image Acquisition Toolbox to assess
the performance of the depth sensor and to evaluate the accuracy of the depth measurements. We wrote a script that lets the user pick
a point in the RGB frame and obtain the corresponding depth. This is not trivial at all, since the RGB and depth camera centres do not coincide, and
accounting for this displacement is mandatory. An introductory document about this can be the following.
Later, a C# program was written using the SDK APIs to implement this fundamental coordinate mapping. The mapping is hard-coded into every Kinect
sensor during factory calibration and is accessible only through the SDK provided by Microsoft. The coordinate mapping is a LUT whose dimensions
are those of the RGB frame (full HD); each entry contains the corresponding pixel of the depth frame. An ad-hoc C# program was later
written to acquire single RGB frames.
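As a rough Python illustration of how such a LUT is used (the real table comes from the SDK's factory calibration; the simple scaling below is a made-up stand-in that ignores the non-linearity and the camera-centre offset):

```python
DEPTH_H, DEPTH_W = 424, 512   # Kinect v2 depth frame size
RGB_H, RGB_W = 1080, 1920     # full-HD RGB frame size

def fake_lut(y, x):
    """Stand-in for the factory-calibrated LUT exposed by the SDK:
    maps an RGB pixel (row y, col x) to the matching depth pixel.
    Here a plain scaling is used purely for illustration."""
    return (y * DEPTH_H // RGB_H, x * DEPTH_W // RGB_W)

def depth_at_rgb_pixel(depth_frame, x, y):
    """Return the depth value under the RGB pixel (x, y) picked by the user."""
    r, c = fake_lut(y, x)
    return depth_frame[r][c]

depth = [[1500] * DEPTH_W for _ in range(DEPTH_H)]  # flat scene at 1.5 m
print(depth_at_rgb_pixel(depth, x=960, y=540))      # prints 1500
```

The real LUT is simply this lookup tabulated per pixel, which is why it must be read from the sensor through the Microsoft SDK.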
Aim assistance: Hardware.
- Raspberry Pi 3 B, running the UNIX-like Raspbian 9 Stretch (2018-03-13) distribution and interfaced with a Raspberry Pi Camera Module v2
- Microsoft Windows 10.1809 x64 PC interfaced with a Microsoft Kinect v2; a USB 3.0 interface is mandatory.
Aim assistance: Software. Most of the functions implementing the assisted aiming system are written in Python 3 with the use of the OpenCV library.
Both the rifle-side and target-side Raspberry Pis are equipped with Python 3.6.2 and OpenCV 3.4.1. We make intensive use of SIFT,
which is free for academic use only (other uses require a licence from its creator); for this reason the module is no longer included in the standard OpenCV distribution and must be added through the opencv_contrib package. The Windows 10 PC is equipped with both Python 3 and OpenCV as provided by Anaconda. The IDE we used is Spyder 3.3.3.
The NTP server runs on the Windows 10 PC, using the Meinberg NTP server. The Kinect interface is provided by the Kinect for Windows SDK 2.0 (discontinued).
Launching the main script at boot was implemented through the "autostart" mechanism.
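On Raspbian Stretch with the default LXDE desktop, such an autostart entry can look like the following (the script path and name are placeholders, not the project's actual ones):

```shell
# ~/.config/lxsession/LXDE-pi/autostart
@lxpanel --profile LXDE-pi
@pcmanfm --desktop --profile LXDE-pi
@python3 /home/pi/main_script.py
```

Each `@`-prefixed line is relaunched by the session manager if it crashes, which is convenient for a long-running acquisition script.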
Sensors firmware. To exploit the full capabilities of the sensors and to interface them with the Raspberry Pi,
we developed a simple firmware in C, based on a hand-written C library
and the well-known WiringPi library. The developed library gives us access to the sensors' internal
programmable microprocessor-based signal processing features, including temperature compensation and gain/offset trim, as well as advanced output
linearization algorithms, providing an extremely accurate and linear output for both end-of-shaft and off-axis applications.
EMI. The sensors can be interfaced in several ways: I2C, SPI, SENT, Manchester, etc. We decided to employ SPI,
a single-ended bus intended for short distances, set up in full-duplex mode with a master-slave architecture and a single master.
Although SPI requires more wires than I2C, we based our choice mainly on robustness against disturbances. Since SPI data
are sampled on the clock's rising/falling edge, rather than being level-sensitive on the clock's active (high) level, the link is
more robust to disturbances over relatively long distances (about 30 cm) in hostile environments.
The cable, in fact, runs next to the motor phases, where highly impulsive currents flow.
Moreover, the communication speed was lowered to 1 Mbps; we intensively tested
the link quality in harsh electromagnetic environments and never encountered communication failures.
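As an illustration of the per-frame integrity check such a link allows, the Python sketch below decodes a hypothetical 16-bit sensor frame carrying a 14-bit angle, an error flag and an even-parity bit; this layout is common for magnetic angle encoders but is not necessarily the exact format of our sensors.

```python
def decode_frame(word):
    """Decode a hypothetical 16-bit SPI frame:
      bit 15    = even parity over bits 0-14
      bit 14    = sensor error flag
      bits 0-13 = raw 14-bit angle (0..16383 -> 0..360 degrees)
    Returns the angle in degrees, or raises on a corrupted frame."""
    parity = bin(word & 0x7FFF).count("1") & 1
    if parity != (word >> 15):
        raise ValueError("parity error: frame corrupted on the bus")
    if word & 0x4000:
        raise ValueError("sensor reported an error flag")
    raw = word & 0x3FFF
    return raw * 360.0 / 16384.0

print(decode_frame(0xA000))  # raw 8192 -> 180.0 degrees
```

A parity check like this is what makes EMI-induced bit flips on the cable detectable in software, complementing the hardware-level robustness discussed above.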