ARUW HuskyBot Announcement

Hey folks!

With an influx of new teams joining the RoboMaster competition over the past year, we at ARUW want to give back to our expanding community. Whether you’re new to RoboMaster or robotics in general, the initial learning curve can be steep—especially with some of the documentation being in Chinese.

Inspired by efforts like EveryBot in FRC, we want to help flatten this learning curve and lower the barrier to entry into robotics. The main goal of HuskyBot is to provide a starter robot code template that contains the necessary boilerplate to get a robot up and running. We envision this as an opportunity to raise the skill floor of the competition while providing a starting point for teams to build on and explore.

That being said, here are some of the primary goals of the project:

  • Handle all interfacing with the Referee System
  • Provide logic for game mechanics such as power and heat limiting (see the power-limiting sketch after this list)
  • Fulfill the requirements needed to compete in the 1v1 competition
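
To make the power-limiting goal a little more concrete, here is a rough sketch of the kind of logic we have in mind: scale all chassis commands by a common factor so the measured power stays under the referee-assigned budget. Every name and constant below is a placeholder for illustration, not final HuskyBot or Taproot API.

```cpp
#include <algorithm>
#include <array>
#include <cstddef>

// Scale all wheel commands by a common factor so the chassis stays under the
// referee system's power budget and doesn't drain the energy buffer abruptly.
// `measuredPowerW`, `powerLimitW`, and `bufferEnergyJ` are hypothetical inputs
// you would source from your power monitoring / referee system code.
void limitChassisPower(
    std::array<float, 4> &wheelOutput,  // desired motor outputs, e.g. in [-1, 1]
    float measuredPowerW,               // current chassis power draw (W)
    float powerLimitW,                  // referee-assigned power limit (W)
    float bufferEnergyJ)                // remaining buffer energy (J)
{
    // Start ramping down before the buffer is empty so the cap is never tripped.
    constexpr float BUFFER_RAMP_START_J = 30.0f;

    float scale = 1.0f;
    if (measuredPowerW > powerLimitW)
    {
        // Proportionally shrink commands when over the limit.
        scale = powerLimitW / measuredPowerW;
    }
    if (bufferEnergyJ < BUFFER_RAMP_START_J)
    {
        // Shrink further as the energy buffer runs low.
        scale *= std::max(bufferEnergyJ / BUFFER_RAMP_START_J, 0.1f);
    }
    for (std::size_t i = 0; i < wheelOutput.size(); i++)
    {
        wheelOutput[i] *= scale;
    }
}
```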

Here are the planned deliverables:

  • Example robot code
  • A step-by-step guide covering installation, build, flashing, and tuning processes

Now, the intention behind this is not to give away the “answers” or “secrets,” or to oversimplify RoboMaster’s challenges. We see this as an opportunity to empower new teams to focus on larger problems and more complex technical challenges, rather than struggling to get started in the first place. The last thing we want to see is a robot die from power limiting: we’re here to see some action!

To make this effort as helpful as possible, we’re looking for community input:

  • What does the robot we are coding for look like?
    • Is this using a Type A or a Type C board? The Type C has challenges with I/O but a better IMU
    • Will the MCB be on the turret or chassis?
    • Is the chassis using Mecanum or Omni wheels?
    • Will the turret have a slip ring? The kit robot provided by DJI does not. Is this still a minimally competitive concept if the turret cannot rotate independently of the chassis?
  • What is the presumed skill level of the end user?
    • Should we explain concepts / provide resources for topics such as CAN and PID? If so, what would be useful? (See the example PID sketch after this list.)
  • What sorts of resources or examples would have helped when you were starting out? I know looking at other teams’ code bases can be complex and somewhat scary; I get confused looking at my own sometimes.
  • Are we using Taproot? (Jk, the answer is yes)
  • Do we include a basic CV system? Hehe, things are being cooked here
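
On the CAN/PID question above: one option is to pair any explanation with tiny, generic snippets like the PID sketch below, so readers can connect the theory to code. This is purely illustrative and not tied to any framework; in practice HuskyBot would lean on Taproot’s own controller utilities.

```cpp
// A bare-bones PID controller of the sort an intro guide might walk through.
// Generic illustration only; gains below are made-up example values.
struct Pid
{
    float kp, ki, kd;
    float integral = 0.0f;
    float prevError = 0.0f;

    // error: setpoint - measurement; dt: seconds since the last update.
    float update(float error, float dt)
    {
        integral += error * dt;
        float derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// Example usage: turn a wheel-speed error (RPM) into a motor command.
//   Pid wheelPid{5.0f, 0.1f, 0.0f};
//   float output = wheelPid.update(targetRpm - measuredRpm, 0.002f);
```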

We aim to have a majority of the code base completed by competition, with the accompanying guides and resources finalized by the start of the next school year. That being said, keep in mind that the primary developers still have classes, internships, and their own code projects to manage. Progress may vary, and there might be delays around these proposed deadlines, but we’re committed to making this resource as robust and helpful as possible! On that note, we would love for this to be a community project, so if this seems like a fun project to you, please get in contact with us: we’d love the opportunity to collaborate!

- Sumedh Panatula, ARUW Software Controls Lead

Something to consider might be supporting a dev board that is not a Type A or Type C, to make sourcing easier for new teams. If you wanted to stay in the F407 family, you could try something like the STM32F4DISCOVERY, which you can buy directly from ST for $20.

While it’s an interesting avenue, the goal at the moment is to provide support for commonly used hardware and utilize Taproot’s functionality. I know UCSD uses a Nucleo and CU uses a Teensy, but my understanding is that most teams use a DJI dev board, and it’s especially common among beginner teams. For example, I helped UMN a little last year, and they had ordered the kit bot and used a Type C for their development.

To add to what Sumedh said about this, Taproot is designed with the Type A/C boards in mind, but with a little bit of work it can be used on other STM32 boards. While I’m not sure of the exact process for this, it’s definitely a possibility for teams who use other boards.

I have briefly touched on this in other discussions, but the concept of HuskyBot as a whole is a much-needed item in the community. Far too often we have seen new teams fall flat in software development during their first year because the ecosystem is notoriously difficult to break into.

Assuming a bare-bones, “plug and play” system, consider the following questions for yourself as a developer:

  • What features should be provided out of the gate for functionality?
  • What features should be left for the reader to discover?
  • What kinds of controllers should be implemented? Should new teams be given beyblade/spintop/spin2win right out of the gate? Or should they have the challenge of trying to implement it, even if their learning is just yoinking it from another repo?

Receiving data from the game server should be a given. I would wager that, with the potential upcoming changes to power monitoring, teams will also need to implement basic CAN/I2C/SPI communication to an external breakout board for current monitors (Hall effect or in-line).


If I were to embark on such a project, I would approach it from the angle of “how can I provide enough functionality to meet the competition skill floor as of right now, while also providing enough information and resources for teams to do more advanced stuff (i.e., auto-aim, sensor fusion, field odometry, particle filters).”

To extend this further, how much additional information outside of software do you want to provide? Should there be anything about mechanical design? Electrical hardware design? Teams often fail in the actual integration between hardware, mechanical, and software. I have seen some absolutely atrocious wiring.

Consistently improving the competition skill floor is what we strive to do, but you have the challenge of trying to help teams learn the concepts of robotics and control systems in such a difficult environment without physically being there to hold their hand (not that you should anyways). It’s a tough task, but I am eager to see where it goes.

Technical Aspects

To answer some of the questions you had about functionality:

  • Use the Type C board. Type A’s are no longer sold iirc, and Type C’s can be combined with communication breakout boards. You are definitely right about I/O though; the lack of exposed UART is tough.
  • If you elect to put the MCB on the turret, a slip ring is a must (obviously). Otherwise, chassis. Rudimentary odometry can be done a little more easily through the IMU if you put the MCB on the chassis.
  • Mecanum or Omni is your choice. Both have their inherent advantages and disadvantages
  • Honestly, I would advise a slip ring on the robot regardless. You can get cheap Senring slip rings from AliExpress that have the current capacity and number of wires for a basic turret.

Skill Level

  • I didn’t do FRC/FTC, which seems to be the background of a bulk of the students in RM, so I’ll let others answer this more thoroughly. Resources/examples of good controllers would be helpful though. Generic closed-loop controllers don’t function well for the system dynamics seen in RM. Cascade/feed-forward controllers make life significantly better for a driver, but I could also see these being teaching moments for learning control systems (see the sketch after this list).
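
To illustrate what I mean by cascade/feed-forward, here’s a bare-bones sketch of a cascaded turret-axis controller: an outer position loop produces a velocity setpoint, and an inner velocity loop plus a feed-forward term produces the motor command. Every gain, limit, and name here is made up for illustration and isn’t tied to any particular framework.

```cpp
#include <algorithm>

// Proportional-only loop used for both stages below (illustrative only).
struct P
{
    float kp;
    float update(float error) const { return kp * error; }
};

// Cascade: position error -> velocity setpoint -> motor command, with a
// feed-forward term carrying the predictable load (e.g. gravity on pitch).
// Gains and limits are placeholders, not tuned values.
float turretAxisCommand(
    float targetAngleRad, float measuredAngleRad,
    float measuredVelRadS,
    float feedforward)
{
    constexpr P positionLoop{8.0f};   // rad of error -> rad/s setpoint
    constexpr P velocityLoop{0.5f};   // rad/s of error -> motor output
    constexpr float MAX_VEL = 6.0f;   // clamp on the velocity setpoint (rad/s)

    float velSetpoint = positionLoop.update(targetAngleRad - measuredAngleRad);
    velSetpoint = std::clamp(velSetpoint, -MAX_VEL, MAX_VEL);

    // Feed-forward handles the predictable load so feedback only fights error.
    return velocityLoop.update(velSetpoint - measuredVelRadS) + feedforward;
}
```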

Resources

  • Give them everything you got lol (within reason). These resources can be basic examples, lecture slides, general RM information: up to y’all. Just keep the skill floor in mind

CV

  • I think including a basic CV system is great! To go off that, I would wager you even provide a basic dataset (500–1000 images) captured through a RealSense or some other vision camera, as well as a test video. Providing model weights for basic robot detection and position would be great too. It’s really no different than trying to do position detection of AprilTags using OpenCV: your bounding box is just different (see the pose sketch after this list)
  • This in mind, be wary of the resources you provide for machine learning. There are a lot of good resources but even more bad resources. Whatever is provided here should be launch points into development for an actual auto-aim system
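
To show how close this is to the AprilTag workflow, here’s a rough sketch of recovering a target’s 3D position from its detected corner pixels with OpenCV’s solvePnP. The plate dimensions and camera intrinsics below are placeholders; you’d substitute your own target geometry and calibration.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Given the four detected corner pixels of a target (ordered to match
// objectPoints below), estimate its 3D position in the camera frame.
cv::Vec3d targetPositionMeters(const std::vector<cv::Point2f> &cornerPixels)
{
    // 3D model of the target corners in its own frame (placeholder size).
    const float halfW = 0.065f, halfH = 0.030f;
    std::vector<cv::Point3f> objectPoints = {
        {-halfW, -halfH, 0}, {halfW, -halfH, 0},
        {halfW, halfH, 0},   {-halfW, halfH, 0}};

    // Camera intrinsics from your own calibration (placeholder numbers).
    cv::Mat K = (cv::Mat_<double>(3, 3) << 600, 0, 320,
                                           0, 600, 240,
                                           0,   0,   1);
    cv::Mat dist = cv::Mat::zeros(1, 5, CV_64F);

    cv::Mat rvec, tvec;
    cv::solvePnP(objectPoints, cornerPixels, K, dist, rvec, tvec);

    // tvec is the target's position in the camera frame, in the same units
    // as objectPoints (meters here).
    return {tvec.at<double>(0), tvec.at<double>(1), tvec.at<double>(2)};
}
```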

General Thoughts

  • Providing a list of features and their general difficulty is helpful for teams to formulate future milestones outside of their first year
  • The more difficult task (and one I highly encourage) is providing introspective questions to the reader such as
    • "I want my robot to auto-nav the field. Why do I want to auto-nav the field? What are the advantages and disadvantages of this? What information do we have on auto-nav algorithms? What is required to implement a feature like this? Do we have the pre-requisite odometry for this to work?

Teams often get caught in the feature grind without asking the question of why they should do it. It’s easy to say “let’s work on auto-aim” without really having a strong breakdown of how to build auto-aim. Encourage feature development, but emphasize the necessity of breaking down feature development into digestible tasks.

I’ll be putting together a planned software design as the next update to this; I plan to go more in depth about the planned subsystems, their architecture, and their functionality. In terms of resources outside of software, I myself am not knowledgeable in those aspects. I know that there are open-source mechanical designs from the Chinese teams, but if other members of our community have an interest and are willing to make such guides, I’m more than willing to include them.

Hey folks!

Thank you all for the feedback and responses on our initial post! Based on the input we received, we’ve decided to use the starter bot (RoboMaster Standard A Unassembled Kit in offline ordering) as the foundation for HuskyBot. This decision was guided by its value proposition: assembling a custom bot with the required hardware and referee system costs ~$3500, but the starter kit provides most of the essentials for ~$1200. This significantly lowers the financial barrier to entry for new teams, and we see it as a strong starting point for our efforts.

Chassis

  • Configuration: 4-Wheel Mecanum (see the wheel-mixing sketch below)
  • Hardware: 4x M3508 motors with C620 ESCs (included in the starter kit)
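
The chassis code largely boils down to the standard mecanum wheel-mixing math sketched below. The sign conventions and scaling are illustrative only; the real code will need to match the kit’s actual motor orientations.

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// Standard mecanum mixing: translate desired chassis motion (forward, strafe,
// rotation) into four wheel speeds. vx = forward, vy = strafe left,
// wz = counter-clockwise rotation; flip signs to match your motor wiring.
std::array<float, 4> mecanumWheelSpeeds(float vx, float vy, float wz)
{
    // Order: front-left, front-right, back-left, back-right.
    std::array<float, 4> wheels = {
        vx - vy - wz,   // front-left
        vx + vy + wz,   // front-right
        vx + vy - wz,   // back-left
        vx - vy + wz};  // back-right

    // Normalize so no wheel command exceeds 1.0 while preserving ratios.
    float maxMag = 1.0f;
    for (float w : wheels) maxMag = std::max(maxMag, std::fabs(w));
    for (float &w : wheels) w /= maxMag;
    return wheels;
}
```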

Launcher

  • Configuration: 2-Wheel Flywheel System
  • Hardware: 2x M3508 motors with C620 ESCs
    • Note: The kit comes with snail motors, but we recommend using M3508s

Agitator

  • Configuration: Above the turret for easier ball pathing
  • Hardware: 1x M2006 motor with C610 ESC (included in the starter kit)

Turret

  • Configuration: Each motor is placed in-line with a rotational axis
  • Hardware: 2x GM6020 motors (included in the starter kit)

Additional Components

Slip Ring:
To enable continuous turret rotation, we strongly recommend adding a slip ring. Non-slip-ring configurations are not considered viable for competitive play, and we plan to incorporate this functionality into the code.

Type A Development Board:
Before you grab your pitchforks, hear us out: while the Type C board has a better IMU, its lack of I/O makes it less viable for the goals of this project.

  • Primarily, having no analog inputs and only 1 serial port (after connecting the referee system and remote) significantly hinders expansion.
  • Furthermore, with DJI’s plans to remove current and voltage information from the referee system, we see an increasing need for external sensors.

The Type A board has the necessary I/O for our design and will be mounted on the turret for turret-side control. While this compromises on IMU quality, we will address solutions for this later.

Current and Voltage Sensors:
We are still exploring models, with the ACS712 as a likely candidate for current sensing. These sensors will be mounted on the chassis and wired through the slip ring to interface with the MCB.
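
As a rough illustration of what reading one of these sensors involves, here’s a sketch of converting a raw ADC reading from an ACS712-style hall-effect sensor into amps. The sensitivity, supply voltage, divider, and ADC parameters below are assumptions that depend on the exact part and wiring; treat them as placeholders and check the datasheet.

```cpp
#include <cstdint>

// Convert a raw ADC reading from a hall-effect current sensor into amps.
// Assumptions (placeholders): a 30 A ACS712-type part (~66 mV/A, output
// centered at Vcc/2 on a 5 V supply) behind a divider that maps its 0-5 V
// output into a 3.3 V, 12-bit ADC range.
float acs712CurrentAmps(uint16_t adcCounts)
{
    constexpr float ADC_MAX = 4095.0f;            // 12-bit ADC
    constexpr float ADC_VREF = 3.3f;              // ADC reference voltage (V)
    constexpr float DIVIDER_GAIN = 5.0f / 3.3f;   // undo the 5 V -> 3.3 V divider
    constexpr float ZERO_CURRENT_V = 2.5f;        // sensor output at 0 A (Vcc/2)
    constexpr float SENSITIVITY_V_PER_A = 0.066f; // 30 A variant, ~66 mV/A

    float adcVolts = (adcCounts / ADC_MAX) * ADC_VREF;
    float sensorVolts = adcVolts * DIVIDER_GAIN;
    return (sensorVolts - ZERO_CURRENT_V) / SENSITIVITY_V_PER_A;
}
```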

LED Indicator:
One annoying thing about calibrating your IMU is that the field crew doesn’t know when it’s over. To assist the field crew, we’ll include an LED that lights up once calibration is complete. This provides clear visual feedback, as the built-in buzzer is often hard to hear.

Flywheels:
The flywheels included in the kit are incompatible with M3508 motors. Taobao sells ones that basically every RM team uses; a link to these is posted in the Discord and will be copied over.

Vision System:
For teams interested in computer vision, we recommend the NVIDIA Jetson Orin Nano Super Developer Kit (NJONSDK) paired with the OV9782 camera module. This combination offers a powerful vision system at around $300. While the NJONSDK is slightly more expensive than alternatives like the Raspberry Pi or Orange Pi, it provides better software support and a higher computational capacity. This combination enables a vision system capable of running at up to 100 FPS.

  • In preliminary testing using an Orin NX with Jetpack 5.1, we achieved model inference times of approximately 3 ms (or 1.5 ms with INT8 quantization), leaving plenty of room for post-processing tasks. For comparison: the Orange Pi 5 achieved 28 ms on its CPU and 8–10 ms on its NPU, while a Ryzen 4500U laptop achieved 10-15 ms. The software is expected to be platform-agnostic (as long as it supports Python), but this is an important factor to keep in mind.

For beginners, implementing a vision system may not be a top priority. However, if your team has the resources and interest, we encourage exploring CV early on—it’s a challenging yet highly rewarding experience.

This framework aims to provide teams with a solid foundation to build on. We’re excited to see how each team takes this base and expands it to meet their own goals. As always, we’re open to feedback and eager to hear your thoughts—feel free to respond, reach out, or share your ideas in the Discord!

Sumedh Panatula
ARUW Software Controls Lead