Using VectorH with Hadoop Security Systems
Requirements
When properly configured, VectorH works on a Hadoop system with the Apache Knox and Apache Ranger services enabled; it does not, however, fully integrate with these services. VectorH provides its own infrastructure for both the gateway and access control functionality that these services offer.
Requirements for using VectorH with Hadoop security systems are as follows:
• Hadoop cluster (tested with Hortonworks HDP 2.3.4 and CentOS 7)
– One cluster machine defined as the Gateway (default is the HDFS NameNode) with the following installed:
• Apache Knox using Ambari – for indirect user authentication
• Apache Ranger using Ambari – for HDFS access control
• Vector Client Runtime installed from the command line (with Name Server (GCN), Communications Server (GCC), and Data Access Server (GCD))
– VectorH installation (on HDFS DataNodes)
• Client machines with JRE 1.7 or higher and the JDBC driver iijdbc.jar
• Machines hosting an LDAPv3 server and a Kerberos server
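To illustrate the client machine requirement above, a JDBC client loads the driver from iijdbc.jar and connects through the Data Access Server (GCD) on the gateway node. The sketch below only assembles the connection URL; the host name, port, and database name are hypothetical placeholders, not values defined in this guide, and the actual connection call is left commented out since it requires a running server.

```java
// Minimal sketch of building a JDBC connection URL for the iijdbc.jar driver.
// All host, port, and database values below are placeholder assumptions.
public class ConnectionUrlExample {

    // Ingres/Vector JDBC URLs take the form jdbc:ingres://host:port/database
    static String buildUrl(String host, String port, String database) {
        return "jdbc:ingres://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) {
        // "gateway.example.com" stands in for the cluster's Gateway machine;
        // substitute the port of your Data Access Server (GCD).
        String url = buildUrl("gateway.example.com", "VW7", "testdb");
        System.out.println(url);

        // With a reachable server and iijdbc.jar on the classpath, a client
        // would then open a connection, e.g.:
        // java.sql.Connection conn =
        //     java.sql.DriverManager.getConnection(url, user, password);
    }
}
```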