Manage 12 to 20 3D printers with Repetier-Server: best way?

Hi there, 

It's all in the title:
What is the best way to manage 12 to 20 3D printers with Repetier-Server? (There are 3 different printer models, so printers of the same model will share gcodes.)
Webcams are optional here, but they would be great!

So if I go with some Raspberry Pis, I would be able to manage 2 printers with 2 cams per Pi (or 4 printers without cams), but the question is: would I be able to see all the Pis and printers (Repetier-Server instances?) in the same "Home" (dashboard)?

Or do I go with a small Linux PC and run all the 3D printers from it, possibly with one webcam per 2 printers? If that is possible, what would be the optimal configuration for it?

Thanks for the help!



Comments

  • At the moment there is no merging of server instances, so each server has its own page. With a beefy PC you might be able to run them all, but I do not think that is clever. If you have that many printers they will be in use often, and when you then need to restart for some reason, like an update, you have to stop all printers or wait for all of them to finish. My advice would be to use one master PC with Linux. There you store your projects. Each instance can access the projects of that master PC and can also outsource CPU-intensive computations to it, like rendering images. So you upload only to one instance and can use the files from any instance. It also solves the cable length problem: with 20 printers on one PC you will need longer USB cables, and that often causes bad communication. So a few satellites sitting closer to the printers are the better solution here.

    Then bundle 2-4 printers per Pi, for example. Use a powered USB hub so you can also have 4 webcams, just do not go crazy with resolution, and connect everything with Ethernet cables. WLAN is quite slow, so with 4 printers and webcams you want a good connection to your network. And PLEASE make sure to use a good 5.1-5.2V power source. Raspberry Pis are sensitive to voltage drops under load and might disconnect USB and hence the printers. With a good power unit that will not happen.

    Bundle identical printer types if possible. Per node you can also share the gcode models between printers, so you upload to one and all can see and start the job.
  • Thank you for your response and advice,

    So I think that for reliability purposes we will not use webcams; we don't really need them.
    I will bundle 4 printers per Pi (same model, as advised, and it makes sense), with a master PC to share gcodes.
    Does a Pi make sense for the master PC, or is something a bit more powerful preferable for outsourcing CPU-intensive tasks?

    Also, I have a few questions about how Repetier-Server works with direct print mode from the master PC:
    Does this fully load the gcode onto the printer before printing it?
    In case the server disconnects during the print and the gcode is fully loaded on the printer, I assume it will not be a problem and the printer(s) will keep printing.
    Also, does Repetier-Server handle filament sensors?

    Concerning the pro version and the ability to customise the frontend, I only saw colour or logo customisation, and I am wondering if I will be able to add some simple links/buttons to each instance's dashboard, or even a dropdown in the menu bar to navigate from one instance to another? (I have a little web development background.)

    Thank you!


  • The master should be bigger than the slaves, or outsourcing CPU tasks makes no sense. Rendering a gcode image takes 3-6 minutes on a Pi, depending on size; my Mac does it in 10-15 seconds. That is where it makes sense. Also, if you use it as a project archive, you want something more secure than an SD card to store the data.

    You can host projects on the master and they can contain gcodes, but for the print they get copied and analysed by the receiving Pi. So network interruptions after the print starts are no problem here.

    Customizations at that level are more for printer vendors; you would need to redo them with every update. The exception is if you use the modules as described here:
    https://www.repetier-server.com/manuals/programming/API/modules.html

    These allow adding links/windows. But the easiest way might be to use a frameset to navigate through the devices and load the selected instance in the lower frame.
  • Thanks, 

    So I am trying to set up the Repetier-Server network.

    I use a MacBook Pro (late 2011) as the master PC; for testing purposes there is only one Pi, directly connected to the MacBook Pro's Ethernet port.
    I am planning to connect a multi-port Ethernet adapter to this Mac to hook up all the Pis. I would love to connect all the devices to the router, but it is too far away from our printing room, so the Mac is connected via WiFi and the Pis are connected to the Mac via Ethernet.

    I can access the Pi's Repetier-Server from the Mac it is connected to thanks to Internet Sharing, but I can't access it from another computer on the same WLAN. (And I can access the master server from any computer on the network.)
    My networking knowledge is a bit limited, so I am really wondering how I can access the satellite servers from a computer other than the master.

    Last but not least, I am trying to access a shared folder of the master Mac from the Pi, but I haven't found any solution for it (I have googled it).
    Do you have any insight into how I can set that up?

    As soon as I manage to test that, I think I will go with the pro version to set up outsourcing CPU usage to the MacBook, but that seems easier than the rest of the setup ;)



  • Your problem is that you have 2 networks: the Pis with the Mac, and the WiFi network. The only device that knows about both is the MacBook, so that is why it can do anything. The WiFi network only sees the Mac.
    The easiest solution might be to add NAT on the Mac so it offers the other Pis under a port on the same IP as the Mac's WiFi interface (see the sketch after this comment). No idea how to do this on a Mac, but I guess Google would help here. Alternatively, make the Ethernet segment transparently visible in the same network, but I have never done this, so also Google.

    The mesh-up of the Pis with the Mac happens in Global Settings -> Connectivity tab. On the Pis, add alternative servers as described in the manual; here add the Mac with its IP and API key. Now all Pis can select the projects from the Mac as well as from their local project folder. This requires the pro version or an active test period.
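
    As a rough illustration of the NAT idea above (this is not the pf/NAT setup the comment refers to, just one quick way to test the same idea), a tiny TCP relay run on the Mac can expose one Pi's web interface under a port on the Mac's WiFi IP. The Pi address and listen port below are placeholders; 3344 is Repetier-Server's default web port:

        # Minimal TCP relay sketch, run on the Mac (which can reach both networks).
        # Forwards http://<mac-wifi-ip>:8081/ to one Pi's Repetier-Server web port.
        import asyncio

        PI_HOST = "192.168.2.10"   # placeholder: the Pi's IP on the Mac's Ethernet side
        PI_PORT = 3344             # Repetier-Server's default web port
        LISTEN_PORT = 8081         # port exposed on the Mac's WiFi IP

        async def pipe(reader, writer):
            # Copy bytes one way until the sending side closes.
            try:
                while data := await reader.read(65536):
                    writer.write(data)
                    await writer.drain()
            finally:
                writer.close()

        async def handle(client_reader, client_writer):
            # For each incoming connection, open one to the Pi and relay both directions.
            pi_reader, pi_writer = await asyncio.open_connection(PI_HOST, PI_PORT)
            await asyncio.gather(pipe(client_reader, pi_writer),
                                 pipe(pi_reader, client_writer))

        async def main():
            server = await asyncio.start_server(handle, "0.0.0.0", LISTEN_PORT)
            async with server:
                await server.serve_forever()

        asyncio.run(main())

    You would need one relay (or one proper NAT rule) per Pi, so treat this only as a way to show the idea; a real pf rule or a bridged setup is the cleaner long-term solution.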
  • Thanks for the insights, 

    I have finally found a way to do almost exactly what we need.

    For the people who might be interested, here is exactly what I have done:

    I use a router (an old Linksys WRT54G) that I have flashed with DD-WRT firmware.
    This allows me to use the router as a client bridge and connect it to the Internet over WiFi (do not use subnets for your client router).

    So the router connects wirelessly to my AP and shares Internet with all the devices connected to the client router's Ethernet ports: my master PC (a MacBook Pro on the WAN port; just assign WAN to switch in the firmware) and all the Pis.

    So all the devices, even the ones behind the client router, are on the same network and accessible over WiFi.
    The main purpose here is to drop files over WiFi to the master PC.

    Just be sure that static IPs are enabled on your AP, as we use DHCP directly from the AP.

    The only thing I have not managed to do is to mount a folder from the master PC on the Pis. I am not even sure if it is possible with a shared folder via AFP or SMB...

    But still, I can use projects from the master PC on the Pis!





  • No, you cannot have a synced folder in the server. The server would not see the changes, and that would lead to serious problems sooner or later. Only sharing within one instance is supported, as the server then knows about the connection.
  • I'm actually working on a very similar project, where I plan to deploy a custom frontend to manage 4-5 instances of the Server on individual Pis. Much of this discussion is very useful, but I don't understand the file handling going on here. I've gotten a handle on most of the API, but I struggle to grasp how the API handles directory structure and file access, and so far I've only been able to execute prints if they are already stored as a model on the server.

    @p_vnct you seem to have solved a problem that eludes me. How are you accessing projects from the master on the Pis?
  • The trick is the projects function. Tell the slaves to use the master's projects. They will still have their own, but you can select the master server as well, and then they will download from the master server.

    Uploading gcodes is not done with websockets. It is a simple HTML POST request with the file; the session is added as an extra header. Best is to check with our frontend in the debug console -> Network tab. Upload a gcode and you will see the headers and data that get sent (see the first sketch at the end of this comment).

    The same also works for websockets in Chrome. Selecting the websocket in the Network tab allows you to see all the frames being sent and received. So if something is unclear or does not work, just call the function and see what gets sent (second sketch below).
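
    To make the POST-upload description above concrete, here is a minimal, hedged Python sketch. The /printer/model/<printer-slug>?a=upload endpoint, the x-api-key header (used here instead of a browser session) and the multipart field name are assumptions, so verify the exact request in the browser's Network tab as suggested above; host, API key, slug and file name are placeholders:

        # Hedged sketch: upload a gcode file to one Repetier-Server instance over HTTP.
        # Endpoint, query parameter and header name are assumptions; confirm them by
        # capturing a real upload in the frontend's Network tab.
        import requests

        SERVER = "http://192.168.2.10:3344"   # placeholder: one Pi running Repetier-Server
        API_KEY = "your-api-key"              # from that server's settings
        PRINTER = "my_printer_slug"           # placeholder printer slug

        with open("part.gcode", "rb") as fh:
            resp = requests.post(
                f"{SERVER}/printer/model/{PRINTER}",
                params={"a": "upload"},                  # assumed upload action
                headers={"x-api-key": API_KEY},          # assumed header; the web UI sends a session instead
                files={"filename": ("part.gcode", fh)},  # multipart file field; name may differ
            )
        print(resp.status_code, resp.text)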
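
    And for the websocket side, a small hedged sketch using the websocket-client package. The /socket/?apikey=... endpoint and the action/data/printer/callback_id message layout are assumptions based on the public API description, so compare them against the frames Chrome shows for the real frontend:

        # Hedged sketch: send one Repetier-Server websocket command and print the reply.
        # Compare path and message layout with the frames visible in Chrome's Network tab.
        import json
        import websocket  # pip install websocket-client

        SERVER = "192.168.2.10:3344"   # placeholder host:port
        API_KEY = "your-api-key"

        ws = websocket.create_connection(f"ws://{SERVER}/socket/?apikey={API_KEY}")
        ws.send(json.dumps({
            "action": "listPrinter",   # ask for the configured printers and their state
            "data": {},
            "printer": "",             # empty = server-level command
            "callback_id": 1,          # echoed back so replies can be matched to requests
        }))
        print(ws.recv())               # raw JSON reply frame
        ws.close()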
  • @rpfaff Yes, with the pro version, in the settings under the Connectivity tab you can add an alternative server for each Pi; just set it to your master.

    But in the end I don't use projects to print files, because each time you print a file from a project it runs the gcode render. That is not really optimal when you launch 4 or 5 prints at the same time; the Pi can't handle those simultaneous tasks and still print flawlessly.
  • Thanks for the input both of you.  Very helpful.
  • @p_vnct Do you mind sharing details, please: how are you currently handling gcode files for each printer? Thank you in advance.