Show HN: No-code Browser Automation. RPA so non-coders can automate work
11 points by yaseer on April 8, 2020 | 7 comments
Hi all,

We built a no-code way to automate work in your browser.

It’s in beta, and we’ve just made it public today:

https://axiom.ai/

https://www.producthunt.com/posts/axiom-browser-automation

    What?
RPA (Robotic Process Automation) is a way to automate using the user-interface of applications, rather than APIs.

It’s been really popular in the enterprise for automating big processes.

We think RPA has great potential as a no-code paradigm for smaller processes.

    Why?
Not everything has an API and not everybody knows how to code.

But everybody knows how to point, click and type. Everybody understands data visualised in a UI.

We see axiom as a novel way to introduce the principles of programming to non-coders, to automate more day-to-day tasks.

    Why would you not use APIs? This is a waste of CPU cycles!
Most of our beta-users are non-technical, in roles like sales or e-commerce administration.

They primarily have repetitive extract, transform, load (ETL) tasks, and this gives them the opportunity to automate where they could not before.

CPU cycles are still far cheaper than their time.

    Use-cases?
The primary use-case right now is ETL workflows in sales, e-commerce and customer support, but it’s a general tool.

Axiom’s beta is not for large-scale data-scraping or automated testing.

    Why should I trust you with my data?
We don’t touch your data. All data-processing and execution occurs client-side, on your machine.

We only store the code for execution. For data storage, we use your Google Drive/Sheets account.

    Other browsers?
We currently only support Chrome.

    Feedback
We’re interested in discovering new niches.

This is still a beta product. If you come across bugs, our support on https://axiom.ai is responsive.

Thanks!



Thanks. This is interesting! I would like to try it out. I'm doing scraping now, because our vendors refuse to give us an API or don't have one. They only give us data feeds via email, and the feeds are very sketchy (no images, for example). We have to scrape their websites. My main tool is Puppeteer. However, I've found it's way easier to scrape XHR network responses, global JavaScript objects, or <script type="application/json"> blocks than the UI.

In my case, I need to scrape two things. One is the SKU number, which is often not displayed in the UI, but if you look into the source code of the webpage you can usually find it.

I also need to scrape all product variants, such as all colors and product sizes, all finishes, etc.

If I do this via the UI, I need to click each product option, wait for the UI to change, and then scrape the content. My work becomes easier if I can capture a JSON data exchange or a JavaScript object that contains this information. In most cases I can find one, so I gave up on reading things directly from the UI.
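
For illustration, a minimal Puppeteer sketch of that approach: reading product data from network responses, embedded JSON blocks, and a global object rather than the rendered UI. The /api/product filter, the JSON shape and the __PRODUCT_STATE__ global are guesses for the example, not taken from any real vendor site.

    // Minimal sketch (Node + Puppeteer, TypeScript): pull product data from JSON
    // sources instead of the rendered UI. URL filter, response shape and the
    // global variable name are illustrative assumptions.
    import puppeteer from 'puppeteer';

    async function scrapeProduct(url: string): Promise<void> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      // 1. Capture XHR/fetch responses that look like product data.
      page.on('response', async (response) => {
        const type = response.headers()['content-type'] ?? '';
        if (response.url().includes('/api/product') && type.includes('application/json')) {
          console.log('from XHR:', await response.json()); // e.g. { sku, variants: [...] }
        }
      });

      await page.goto(url, { waitUntil: 'networkidle0' });

      // 2. Read embedded <script type="application/json"> blocks.
      const blocks = await page.$$eval('script[type="application/json"]',
        (nodes) => nodes.map((n) => n.textContent ?? ''));
      for (const text of blocks) {
        try { console.log('embedded JSON:', JSON.parse(text)); } catch { /* skip non-JSON */ }
      }

      // 3. Fall back to a global JavaScript object exposed by the page.
      const state = await page.evaluate(() => (window as any).__PRODUCT_STATE__ ?? null);
      console.log('global object:', state);

      await browser.close();
    }

When a page exposes a stable JSON source like this, it's usually far more robust than clicking through each variant and waiting for the UI to update.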

I learned a new term, "ETL workflow". I've been doing this data-scraping thing for a while, and I didn't know there was a whole ecosystem around it. Scraping is such a pain; it's sad that there is a lack of common data-exchange standards.


Great, thanks! If you get stuck, e-mail ai@axiom.ai and we'll see if we can help.

It sounds like this is a possible use-case, but it depends on the scale at which you want to scrape.

Axiom is good at pulling out data from complex workflows with forms.

It's not designed for doing large-scale scraping (...IP rotation etc.)


This is interesting! Been working in the space too, but for email. I can see this being useful for speeding up QA and for scraping data.

Do you have examples of the main use cases we could try? Is this mainly made for engineers to automate prerecorded actions, or do you see it applying to other verticals?


Thanks!

Right now axiom isn't great for QA, as we don't have conditional logic yet, but that feature is inevitable and will make QA pretty easy.

It's not good for scraping large-scale data-sets, but is good for data extraction from very complex forms.

The ideal use case is 'Extract, Transform, Load'. Most scrapers can only extract data; we can extract it, transform it, and load it into another system.
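
To ground the terminology (axiom itself is no-code, so this is just what the same ETL step would look like written by hand; the lead fields and CRM endpoint are made up for the example):

    // Toy Extract-Transform-Load sketch, not axiom's internals.
    // Field names and the CRM endpoint are hypothetical.
    type ScrapedLead = { name: string; company: string; email: string };
    type CrmContact = { fullName: string; account: string; email: string; source: string };

    // Extract: rows pulled out of a web UI (hard-coded stand-ins here).
    const extracted: ScrapedLead[] = [
      { name: 'Ada Lovelace', company: 'Analytical Engines Ltd', email: 'ADA@EXAMPLE.COM' },
    ];

    // Transform: normalise and reshape into the target system's schema.
    const contacts: CrmContact[] = extracted.map((lead) => ({
      fullName: lead.name.trim(),
      account: lead.company,
      email: lead.email.toLowerCase(),
      source: 'browser-automation',
    }));

    // Load: push into another system, e.g. a CRM REST endpoint.
    async function load(rows: CrmContact[]): Promise<void> {
      for (const row of rows) {
        await fetch('https://crm.example.com/api/contacts', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(row),
        });
      }
    }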

We've seen adoption for Sales and E-commerce admin in particular.


What would be the use case for our sales then? Say I wanna reach out to startups via email or have Hangouts calls and go through the sales funnel.


So, most sales use cases are for moving data into or out of a CRM, where the APIs are awkward or don't exist (e.g. LinkedIn -> HubSpot, SalesForce -> Sales Loft).

In your case, you could create a template for your emails and Hangouts invites, mail-merge style. In addition to sending the Hangouts invite, you could trigger another action in your CRM.

I'm speculating here as an example; it really depends on which particular processes you have that are repetitive.


It's pretty amazing for a beta product. Looks promising to me, as far as I've tested it.



