First stages in implementing the scanning behaviour at waypoints. The idea is that when a waypoint is reached, and that waypoint instructs the robot to scan, the robot picks up the waypoint’s scan points. Any number of scan points can be specified by parenting transforms under the main scan point object of that waypoint (roughly as in the sketch below).
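A minimal sketch of that idea, assuming a hypothetical Waypoint component (the class and field names here are illustrative, not the project’s actual code): every child transform under the waypoint’s scan point object is treated as a scan point.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: a waypoint that can instruct the robot to scan,
// handing out every child transform under its scan point object.
public class Waypoint : MonoBehaviour
{
    public bool scanHere;            // does this waypoint instruct the robot to scan?
    public Transform scanPointRoot;  // the main scan point object of this waypoint

    // Any transform parented under scanPointRoot counts as a scan point.
    public List<Transform> GetScanPoints()
    {
        var points = new List<Transform>();
        if (scanPointRoot == null) return points;

        foreach (Transform child in scanPointRoot) // iterates direct children
            points.Add(child);

        return points;
    }
}
```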
The scanning movement is handled by the RobotMovementClass, not the RobotGameClass, though I’m still feeling out whether that’s the best solution.
On another note, I’ve modified the game pause/unpause behaviour to use SendMessage to the NavMeshAgents instead of if() conditions in their Update() callbacks; this is mainly because NavMeshAgent functions like Resume() were unintentionally getting called when the agents were supposed to sit still! A rough sketch of the approach follows.
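This is only an illustration of the pattern, not the project’s actual code: the message names ("OnGamePaused"/"OnGameResumed") and the listener component are assumptions, and newer Unity versions use isStopped where older ones had Stop()/Resume().

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: each agent-carrying object reacts once to a pause/unpause message,
// instead of checking a pause flag in every Update().
public class AgentPauseListener : MonoBehaviour
{
    private NavMeshAgent agent;
    private bool wasMoving;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    // Invoked via SendMessage("OnGamePaused") from the game manager (hypothetical name).
    void OnGamePaused()
    {
        wasMoving = !agent.isStopped;
        agent.isStopped = true;      // older Unity versions: agent.Stop()
    }

    void OnGameResumed()
    {
        if (wasMoving)
            agent.isStopped = false; // older Unity versions: agent.Resume()
    }
}
```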
Been doing some modelling over the past few days as a break from coding. Actually, I didn’t really want a break from coding, but I felt the need to go back and visualise the game again.
I have to admit that I don’t have a strong theme; I do have a certain style I know I want to hit, but ‘style’ as I define it is more specific than theme. The game is set in some futuristic landscape, but I still have this Blade Runner/Strange Days/Neuromancer grungy cyberpunk idea that keeps me from getting too slick, hence the hoodie and trenchcoat. The cute little characters, though, provide a contrast that I’m not sure is good or bad.
This class covers functionality that I deferred when I was creating the robot’s ‘game’ class. Originally, I had simply used a Raycast to determine LOS; I knew a scanning behaviour would be needed, so I coded the robot generically in that respect.
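For reference, that original raycast-style LOS test is roughly the following; the helper name and signature are mine, not from the project.

```csharp
using UnityEngine;

// Minimal line-of-sight test in the spirit of the original approach:
// cast a ray towards the target and see whether anything else blocks it first.
public static class LineOfSight
{
    public static bool Check(Transform from, Transform target, float maxRange)
    {
        Vector3 toTarget = target.position - from.position;
        RaycastHit hit;
        if (Physics.Raycast(from.position, toTarget.normalized, out hit, maxRange))
            return hit.transform == target;   // first thing hit is the target itself
        return false;
    }
}
```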
The ScannerClass is attached to a scanning object that is parented under the robot’s root transform. The behaviour is that it scans around (in an oscillating pattern or a full revolution) using the parent transform as its base direction. The class itself rotates the scanner game object for (my) convenience.
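A sketch of what that sweep might look like, assuming a ScannerClass-style component that rotates its own game object around the local up axis, with the parent’s forward direction as the centre of the sweep; the field names and tuning values are guesses.

```csharp
using UnityEngine;

// Sketch: oscillating or full-revolution sweep driven by a single scan angle,
// applied as a local rotation so the parent transform supplies the base direction.
public class ScannerSweep : MonoBehaviour
{
    public float scanArc = 90f;      // total sweep in degrees (oscillating mode)
    public float scanSpeed = 1.5f;   // sweep rate
    public bool fullRevolution;      // spin continuously instead of oscillating

    void Update()
    {
        float scanAngle;
        if (fullRevolution)
            scanAngle = (Time.time * scanSpeed * 360f) % 360f;
        else
            scanAngle = Mathf.Sin(Time.time * scanSpeed * Mathf.PI * 2f) * scanArc * 0.5f;

        // Local rotation only; the parent's facing defines angle zero.
        transform.localRotation = Quaternion.Euler(0f, scanAngle, 0f);
    }
}
```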
When the RobotGameClass asks the ScannerClass for LOS to a specified target (usually the one that attacked it), the scanner determines at that moment whether the target has been found; if so, it locks on to it. The RobotGameClass, meanwhile, instructs the RobotMovementClass to face and move towards the target, which makes the scanner face the target naturally. If the target is lost from sight, the lock is broken by the ScannerClass itself and it goes back to scanning; based on the RobotGameClass AI, the target will either be reacquired or not.
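A hedged sketch of that lock-on flow: the game logic asks the scanner for LOS, the scanner locks on when it can see the target, and it drops the lock itself once the target is out of sight. Class and method names are illustrative, not the project’s real API.

```csharp
using UnityEngine;

// Sketch of the scanner's lock/unlock behaviour described above.
public class ScannerLock : MonoBehaviour
{
    public float maxRange = 30f;
    private Transform lockedTarget;

    public bool HasLock { get { return lockedTarget != null; } }

    // Called by the robot's game logic, e.g. when it has been attacked.
    public bool RequestLineOfSight(Transform target)
    {
        if (CanSee(target))
        {
            lockedTarget = target;   // lock on at the moment LOS is confirmed
            return true;
        }
        return false;
    }

    void Update()
    {
        // The scanner breaks the lock itself when the target drops out of sight;
        // the game logic then decides whether to try to reacquire.
        if (lockedTarget != null && !CanSee(lockedTarget))
            lockedTarget = null;
    }

    private bool CanSee(Transform target)
    {
        Vector3 toTarget = target.position - transform.position;
        RaycastHit hit;
        return Physics.Raycast(transform.position, toTarget.normalized, out hit, maxRange)
               && hit.transform == target;
    }
}
```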
What I learned from coding the ScannerClass was a better awareness of how rotations (expressed as Quaternions) and vectors translate to and from each other. Because the scanner is driven by a ‘scan angle’, which is a single float, I needed a procedure to convert target vectors into Quaternion rotations and then into Euler angles. It sounds a bit convoluted, but to keep the driving values simple, which is often where the confusion comes from, the messy bit had to be sorted out that way.
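For illustration, the conversion could look something like this: take a world-space direction to the target and express it as the single float scan angle around the parent’s up axis, with the parent’s forward as angle zero. The helper name and the exact angle convention are assumptions.

```csharp
using UnityEngine;

// Sketch: world-space target position -> signed scan angle (degrees) for the scanner.
public static class ScanAngleUtil
{
    public static float ToScanAngle(Transform scannerParent, Vector3 targetPosition)
    {
        Vector3 toTarget = targetPosition - scannerParent.position;

        // Work in the parent's local space so the parent's facing is angle zero.
        Vector3 local = scannerParent.InverseTransformDirection(toTarget);
        local.y = 0f;                                  // flatten onto the scan plane
        if (local.sqrMagnitude < 1e-6f) return 0f;     // target directly above/below

        // Direction -> rotation -> Euler angle, then remap to a signed angle.
        Quaternion look = Quaternion.LookRotation(local.normalized, Vector3.up);
        float angle = look.eulerAngles.y;              // 0..360
        return (angle > 180f) ? angle - 360f : angle;  // -180..180
    }
}
```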