Alternative ringing tower print for input shaping calibration

Hi,

The current ringing_tower test isn’t too bad for input shaping calibration, but it has one severe defect: for each tested axis it technically changes the velocity of both axes, because it contains diagonal moves that are pretty much impossible to avoid. For example, a 45-degree diagonal move at toolhead speed v drives each axis at v/√2, so any speed change along the diagonal changes the speed of both X and Y:

As a result, both axes can start to vibrate after the notch. If one is lucky and the vibrations of the axis collinear with the toolhead motion are small enough, the perpendicular vibrations of the other axis will dominate, and one can measure its resonance frequency. However, if the vibrations of the collinear axis are too large, e.g. because that axis is much heavier than the other, they may interfere with the vibrations of the axis of interest, producing irregular waves on the side of the print that are hard to measure. Here one can see that there are some parasitic waves after the notch, and that the period of the waves seems to change after some time:

Looking at the accelerometer data (obtained using motan tools):


one can see that even though this is the side of the test for the X axis, the Y axis actually resonates more and has a higher amplitude of vibrations, and some vibrations are also present on the Z axis.

In order to overcome this defect, I tried to create an alternative test that excites only one axis - the tested one - at a time. Unfortunately, making such a test required direct GCode generation. Here’s what I came up with:


In a nutshell, the test first accelerates both axes during a diagonal move, but then decelerates only one axis - the tested one - to a complete stop, while the other axis maintains its original speed. This way, vibrations are excited on only one axis at a time.
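To make the idea concrete, here is a minimal Python sketch of that deceleration phase (hypothetical code, not the actual module; the function and names are made up for illustration). Like the real generator presumably does via the deceleration_points parameter shown further below, it subdivides the deceleration into many short segments:

# Hypothetical sketch: segment endpoints for the deceleration phase.
# The tested axis (X here) decelerates from per-axis speed v (mm/s) to a
# stop at a constant rate `accel` (mm/s^2) while Y keeps moving at v.
def decel_segments(x0, y0, v, accel, n=100):
    t_total = v / accel  # time for the tested axis to come to a stop
    points = []
    for i in range(1, n + 1):
        t = t_total * i / n
        x = x0 + v * t - 0.5 * accel * t * t  # decelerating axis
        y = y0 + v * t                        # constant-speed axis
        vx = v - accel * t                    # remaining X speed
        speed = (vx * vx + v * v) ** 0.5      # toolhead speed at segment end
        points.append((x, y, speed))
    return points

Each (x, y, speed) triple maps naturally onto one short extrusion move.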

The test has embossed axis letters on the sides where the measurements are supposed to be performed. They can also be used to estimate the magnitude of smoothing after the input shaper has been calibrated, as they become less crisp as the acceleration increases:

With the new test, I managed to calibrate the input shaper more reliably by measuring the distances and the numbers of waves at several places on the print and taking the averages:

Axis      N   D, mm   F, Hz
Y         3    5.7    42.11
Y         4    7.5    42.67
Y         4    7.35   43.54
Y         6   11      43.64
Y         4    7.3    43.84
Y (avg)                43.2
X         3    5.65   42.48
X         4    6.9    46.38
X         5    8.5    47.06
X         4    7.25   44.14
X         3    5.2    46.15
X         4    7.4    43.24
X (avg)                44.9

These results did not correspond 100% to the data from the accelerometer calibration:


but this is likely because I have several resonances on each axis. The results were still quite good with the EI input shaper.

At least, these were the results on my Ender 3 Pro; it may work differently on, say, CoreXY or Delta printers. I also understand that the accelerometer-based calibration may be more robust. However, even then it is good to have the ability to confirm the results of the calibration. And some users may not wish to purchase an accelerometer, yet they may still benefit from a print-based test. So, if you want to give it a try, I made a module, similar to the newly proposed PA test, available in the pa-tuning-script branch of dmbutyugin/klipper on GitHub (and also in this and this branch).

In order to enable it, I suggest adding the following snippet to printer.cfg:

[ringing_test]

[delayed_gcode start_ringing_test]
gcode:
    {% set vars = printer["gcode_macro RUN_RINGING_TEST"] %}
    ; Add your start GCode here, for example:
    G28
    M190 S{vars.bed_temp}
    M109 S{vars.hotend_temp}
    M106 S255
    {% set flow_percent = vars.flow_rate|float * 100.0 %}
    {% if flow_percent > 0 %}
    M221 S{flow_percent}
    {% endif %}
    {% set layer_height = vars.nozzle * 0.5 %}
    {% set first_layer_height = layer_height * 1.25 %}
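    # PRINT_RINGING_TOWER prints the actual test; FINAL_GCODE_ID tells it
    # to trigger the end_ringing_test delayed gcode below once the print ends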
    PRINT_RINGING_TOWER {vars.rawparams} LAYER_HEIGHT={layer_height} FIRST_LAYER_HEIGHT={first_layer_height} FINAL_GCODE_ID=end_ringing_test

[delayed_gcode end_ringing_test]
gcode:
    ; Add your end GCode here, for example:
    M104 S0 ; turn off temperature
    M140 S0 ; turn off heatbed
    M107 ; turn off fan
    G91 ; relative positioning
    G1 Z5 ; raise Z
    G90 ; absolute positioning
    G1 X0 Y200 ; present print
    M84 ; disable steppers
    RESTORE_GCODE_STATE NAME=RINGING_TEST_STATE

[gcode_macro RUN_RINGING_TEST]
variable_bed_temp: -1
variable_hotend_temp: -1
variable_nozzle: -1
variable_flow_rate: -1
variable_rawparams: ''
gcode:
    # Fail early if the required parameters are not provided
    {% if params.NOZZLE is not defined %}
    {action_raise_error('NOZZLE= parameter must be provided')}
    {% endif %}
    {% if params.TARGET_TEMP is not defined %}
    {action_raise_error('TARGET_TEMP= parameter must be provided')}
    {% endif %}
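    # Stash the parameters so the start_ringing_test delayed gcode above can read them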
    SET_GCODE_VARIABLE MACRO=RUN_RINGING_TEST VARIABLE=bed_temp VALUE={params.BED_TEMP|default(60)}
    SET_GCODE_VARIABLE MACRO=RUN_RINGING_TEST VARIABLE=hotend_temp VALUE={params.TARGET_TEMP}
    SET_GCODE_VARIABLE MACRO=RUN_RINGING_TEST VARIABLE=nozzle VALUE={params.NOZZLE}
    SET_GCODE_VARIABLE MACRO=RUN_RINGING_TEST VARIABLE=flow_rate VALUE={params.FLOW_RATE|default(-1)}
    SET_GCODE_VARIABLE MACRO=RUN_RINGING_TEST VARIABLE=rawparams VALUE="'{rawparams}'"
    SAVE_GCODE_STATE NAME=RINGING_TEST_STATE
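    # Schedule the start sequence to run almost immediately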
    UPDATE_DELAYED_GCODE ID=start_ringing_test DURATION=0.01

Then the ringing tower test can be triggered via a command like RUN_RINGING_TEST NOZZLE=0.4 TARGET_TEMP=210 BED_TEMP=55.

The [ringing_test] module has the following parameters that can be altered (shown here with their default values):

[ringing_test]
# Interesting parameters that may require adjustment
size: 100
height: 60
band: 5
perimeters: 2
velocity: 80
brim_velocity: 30
accel_start: 1500  # the acceleration of the start of the test
accel_step: 500  # the increment of the acceleration every `band` mm
layer_height: 0.2
first_layer_height: 0.2
filament_diameter: 1.75
# Parameters that are computed automatically, but may be adjusted if necessary
center_x: ...  # Center of the bed by default (if detected correctly)
center_y: ...  # Center of the bed by default (if detected correctly)
brim_width: ... # computed based on the model size, but may be increased
# Parameters that are better left at their default values
# notch: 1  # size of the notch in mm
# notch_offset: ... # 0.275 * size by default
# deceleration_points: 100
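
As a rough illustration of how these parameters interact (a hypothetical helper, not part of the module, assuming the acceleration is constant within each band and steps up between bands, as the comments above describe):

# Hypothetical helper: acceleration used at height z, assuming it starts
# at accel_start and increases by accel_step every `band` mm of height.
def accel_at_height(z, accel_start=1500.0, accel_step=500.0, band=5.0):
    return accel_start + accel_step * int(z // band)

# With the defaults above: accel_at_height(0.0) -> 1500.0,
# accel_at_height(12.0) -> 2500.0 (third band)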

And one small addition:

[ringing_test]
velocity: 80

is the velocity one must use as V in the formula F = V * N / D when calculating the resonance frequency. As usual, N is the number of oscillations and D is the distance they span:
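In code form (a hypothetical helper, just to pin down the formula):

# Hypothetical helper: resonance frequency from a printed test.
# N oscillations spanning a distance D (mm), printed at velocity V (mm/s).
def ringing_frequency(n, d_mm, velocity=80.0):
    return velocity * n / d_mm

# First Y row of the table above: ringing_frequency(3, 5.7) -> ~42.11 Hz.
# Averaging several such measurements per axis gives the final estimate.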


Very interesting. Seems a clever way to inject resonances into a test print.

As high-level feedback, I’m leery of adding a “mini slicer” to the master Klipper branch. My fear is that it could become a long-term support/development burden, as well as a configuration burden for users.

I understand the value in directly generating a print from within Klipper - it enables precise generation of extrusion moves. It also removes potential variance that different external slicers may introduce.

However, many printers require detailed startup sequences to print effectively (bed temperature, nozzle temperature, chamber temperature, bed levelling, gantry levelling, nozzle wiping, brims, skirts, rafts, different first layer speeds, different first layer heights, different first layer temperatures, fan speeds, LED lighting, etc.). I fear that starting down the path of a “mini slicer” will invite significant future complexity.

A user has to be able to configure and use an external slicer to effectively use a 3D printer today. (I could talk for hours about how needlessly over-complicated I think modern slicers are, but ultimately the reality is that users need to be able to learn and use one.) I fear adding a “mini slicer” to Klipper will increase the configuration burden for users, as they will ultimately need to learn and configure both a “real slicer” and the “mini slicer”.

So, I think the test is interesting and I look forward to seeing test results. From a “high-level project direction” standpoint, though, I think this approach has some notable risks.

Cheers,
-Kevin


Maybe for the more experienced users you could add that, and instruct them to use their start_print and end_print macros, so one can further optimize the printing process.

TBH, in the case of other test generators I have also seen countless, often contradictory requests to add different features/options (e.g. heat up the bed and hotend simultaneously to speed up the startup sequence, heat them up sequentially to avoid problems on low-spec PSUs, add bed meshing calls, add [firmware-dependent] pressure advance configuration, etc.). So, @koconnor, if you are worried about this type of challenge, I fully agree with you. However, notably, this ringing tower test generator (and the other PA test generator) is actually very minimal: it generates only the GCode commands essential for printing the tower (and, well, its brim). I expect the rest of the code to be added by the users to their start_ringing_test / end_ringing_test gcode sequences to their liking. They can configure the appropriate heating, homing, and bed meshing sequences there, and whatever else they need (e.g. enabling the fan, additional purge lines, pressure advance setup, adjusting the flow rate, etc.)

And so I’m looking forward to feedback from various users. If we find that this covers like 99% of the needs of different users, I think that’s good enough and we can keep it that way. If not - well, I do not know. It is not difficult to generate a 3D shape matching the test and export it to an STL file with sufficient precision. However, modern slicers lack the tools needed to generate the required velocity profile when printing the perimeters of the model, so, unfortunately, that is a dead end. That said, PrusaSlicer and SuperSlicer have some built-in calibration generators, so that might be a last-resort option. Although that would force users into using a specific slicer, and I am also not very familiar with those calibration generators and what is actually adjustable vs what isn’t, so it might still not work.

Kevin, it doesn’t need to be mainline.
Start a second repository with user-contributed scripts/tools, plus links to related bigger tools, which are better picked up from their original repositories each time.


I think exactly such features are the “unique selling point” of Klipper. Let’s take the CAN implementation as an example:

  • Does it add code complexity? Most likely yes
  • Does it add user complexity? Definitely
  • Does it add a support burden? Yes
  • Is it a great feature and worth having all above? Yes!

I dare not judge the code complexity / maintenance burden aspects, but as for user complexity, I think Klipper should accept it more freely in favor of new features or flexibility.
I also honestly do not think that Klipper does any worse compared to the other big firmwares (Marlin, RRF), and I have used all 3 of them.

  • Each of them has some entry hurdles
  • Marlin is just perceived as less complex because the majority uses precompiled firmware and never bothers to build from source or change options. Defining a printer from scratch in Marlin, or syncing the configuration between different releases, is a major pain
  • RRF has a completely different philosophy and equally needs quite a learning curve to get started
  • Klipper IMO is doing much better on so many aspects, e.g. documentation, support etc.

Overall I think that Klipper would profit from being more open to contributions, even if they would mean additional complexity.

As for the gcode generator proposed here, I think it would be a useful addition, especially on the PA side.
Currently I’m using @dmbutyugin’s proposed PA test tower and have found it very useful: easy to set up, quick to print, and easier to interpret than the traditional one.


Thanks for the feedback. I agree there is a difficult balance between managing features and managing long term maintenance. I wish I knew of a better way to convey and manage that balance.

To be clear, I have not made a judgement call on the proposal in this thread. My comments earlier were intended to relay high-level thoughts on where I felt this feature lies on a hypothetical “complexity vs benefits curve”. I do feel this proposal would add notable long-term developer and user-facing complexity, which should be taken into consideration.

Cheers,
-Kevin

I’m afraid it might not be that simple. This code, in particular, requires some changes in the virtual sdcard code so that the user can pause and cancel the test mid-print. In general, there’s always a risk of the two repositories diverging, and that the one with the user-provided content will break and stop working. Having repositories for macros is one thing, but having code that uses Klipper’s internal API in a separate repository may be fragile.

And separately, there’s another question: if there is a much better alternative, why suggest a subpar option in the main Klipper documentation instead? I personally think that Klipper should promote good 3D printing practices (or, at least, not promote worse practices). Now the question is, does either of the new tests (for input shaping and PA) provide enough user value and benefits to replace the currently suggested ones or not? I think only user feedback can tell. However, it should probably not be in the form ‘X is better than Y in my opinion’, because we all have different opinions, but more like: X gives a result A, and Y gives a result B, and B is objectively better (i.e. the prints look better or have fewer defects). So, if you find value in either of the newly proposed tests, I’d encourage you to share your results.