Kofax Kapow: Features

Robotic Process Automation and Web Data Integration

Kapow Enterprise

Kofax Kapow is an enterprise-class robotic process automation and integration platform that is highly scalable and flexible, providing all the robotic automation and intelligence capabilities an enterprise organization requires.

Robot Design and Deployment

The Kapow Design Studio provides design, deployment, QA and production support tools, including performance dashboards, a scheduler, and viewers for data sources and targets.

Supports Multiple Data Sources and Types

Supports all types of application environments and data sources, including websites, portals, enterprise systems, legacy applications, Excel, email, XML, JSON, CSV, and SQL.

Robotic Synthetic API™

Automatically publish robots with standard Java, .NET, SOAP, and RESTful interfaces that can be used to control robotic processes from external applications and remote systems.
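As a sketch of what calling a published robot over its RESTful interface could look like: the host, port, and URL path below are illustrative assumptions, not documented Kapow endpoints.

```python
import json
from urllib import request

# Hypothetical RoboServer REST endpoint -- the URL scheme and the
# "parameters" payload shape are assumptions for illustration only.
def build_robot_call(host, port, project, robot, params):
    """Build the URL and JSON body for invoking a published robot."""
    url = f"http://{host}:{port}/rest/run/{project}/{robot}"
    body = json.dumps({"parameters": params}).encode("utf-8")
    return url, body

def run_robot(host, port, project, robot, params):
    """Send the request and return the robot's JSON output."""
    url, body = build_robot_call(host, port, project, robot, params)
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires a running server; names are placeholders):
# rows = run_robot("roboserver", 50080, "MyProject", "ExtractPrices",
#                  {"ticker": "KFX"})
```

The same robot could equally be driven from Java or .NET; the REST form is shown because it needs no client library.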

Centralized Robot Deployment

Robots are deployed, managed, and executed from a centralized server, and communicate with applications running in a virtualized environment.

Read/Write Access to Data Endpoints

Enable read/write capabilities to Oracle, DB2, SQL Server, MySQL, PostgreSQL, Sybase, and NoSQL databases.
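To illustrate the read/write pattern, here is a minimal sketch using Python's DB-API; sqlite3 stands in only so the example is self-contained, and the table name and columns are invented for illustration. The same pattern applies through the drivers for the databases listed above.

```python
import sqlite3

# Illustrative stand-in: the identical DB-API pattern works against
# Oracle, DB2, SQL Server, MySQL, PostgreSQL, and Sybase drivers;
# sqlite3 is used here only to keep the sketch runnable as-is.
def store_extracted_rows(conn, rows):
    """Write rows harvested by a robot into a staging table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS extracted
                    (source TEXT, field TEXT, value TEXT)""")
    conn.executemany("INSERT INTO extracted VALUES (?, ?, ?)", rows)
    conn.commit()

def read_back(conn, source):
    """Read rows back for one source, e.g. for a downstream process."""
    cur = conn.execute(
        "SELECT field, value FROM extracted WHERE source = ?", (source,))
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
store_extracted_rows(conn, [("siteA", "price", "9.99"),
                            ("siteA", "sku", "X-100")])
print(read_back(conn, "siteA"))
```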

Business Intelligence and Data Analytics

Delivers data to Kofax Insight, SAP BusinessObjects, Tableau, Qlik, and other business intelligence tools.

Flexible, Scalable Deployment

A stateless, multi-threaded architecture that scales easily and can be deployed on-premise in the enterprise or provisioned in the cloud.

Extensive Security Controls

Role-based access to the platform, with LDAP, Active Directory, or built-in user management to secure access.

Operational Monitoring and Analytics

Kofax Analytics for Kapow delivers out-of-the-box dashboards focused on data integration robot operations and system performance.

Publish Robots as Consumable Apps to Users

Lightweight business applications called Kapow Kapplets can be designed to execute robots based on set parameters or present data back to a business user.

Auditing and Logging

Provides alerts and a complete historical record of every transaction. Robots authenticate to sites and web services using user ID/password, basic authentication, NTLM, OAuth, or digital certificates.
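As a small sketch of one of the mechanisms listed above, basic authentication amounts to sending a base64-encoded `user:password` pair in an `Authorization` header; the target URL below is a placeholder, not a Kapow endpoint.

```python
import base64
from urllib import request

def basic_auth_header(user, password):
    """Build an HTTP Basic authentication header (RFC 7617)."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode()
    return {"Authorization": f"Basic {token}"}

# Placeholder usage -- NTLM, OAuth, and client certificates would each
# use their own handshake instead of this single header:
# req = request.Request("https://example.com/service",
#                       headers=basic_auth_header("robot", "secret"))
```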