Setting Up WordPress on Your Own Server

Download and extract WordPress

wget https://wordpress.org/latest.tar.gz

tar -xzvf latest.tar.gz

Create a database and a user and link that to your WordPress

You can use either MySQL or MariaDB as the database backend for WordPress. In this tutorial, I'll use MySQL.

# Install mysql-server

sudo apt-get install mysql-server

# Note: on Ubuntu, /etc/mysql/debian.cnf holds the credentials of the debian-sys-maint maintenance account; the root account typically uses socket authentication, so "sudo mysql" also opens a root shell

Next, log into the MySQL server using the mysql client and create a database and a user for WordPress:

mysql -u root -p

mysql> CREATE DATABASE wordpressdb1;
mysql> CREATE USER 'wordpressdb1user'@'%' IDENTIFIED BY 'wordpressdb1user_pass';
mysql> GRANT ALL PRIVILEGES ON wordpressdb1.* TO 'wordpressdb1user'@'%';
mysql> FLUSH PRIVILEGES;

Now, link the database to WordPress. First, copy wp-config-sample.php to wp-config.php:

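From inside the extracted wordpress directory, that is:

cp wp-config-sample.php wp-config.php

Then open wp-config.php and fill in the database details you created above: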
/** The name of the database for WordPress */
define( 'DB_NAME', 'wordpressdb1' );

/** MySQL database username */
define( 'DB_USER', 'wordpressdb1user' );

/** MySQL database password */
define( 'DB_PASSWORD', 'wordpressdb1user_pass' );
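wp-config-sample.php also defines DB_HOST, which defaults to 'localhost'. That default is correct for this single-server setup; only change it if your database runs on a different machine:

define( 'DB_HOST', 'localhost' );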

Install PHP and Apache2

sudo apt-get install php libapache2-mod-php php-mysql

(Older guides also install php-mcrypt, but that package is no longer available on recent Ubuntu releases and WordPress does not require it.)

sudo apt-get install apache2
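To sanity-check both installs, you can print the versions (the output will vary by Ubuntu release):

php -v

apache2 -v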

Install WordPress

With Apache2 installed, HTTP files are served from the /var/www/html/ folder by default. Move all of the WordPress files and folders into this folder and restart Apache2 by running sudo service apache2 restart.
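As a sketch of that step, assuming WordPress was extracted into a local wordpress/ directory (the chown assumes Apache runs as the default www-data user on Ubuntu):

sudo mv wordpress/* /var/www/html/

sudo chown -R www-data:www-data /var/www/html/

sudo service apache2 restart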

Now, point your browser to http://ip-address-of-your-machine/wp-admin/install.php and the WordPress installation will start.

Deep Dive Into Kafka HDFS Connect

Previously in this article, I wrote about Kafka Connect. Today, I'm going to get into the details of one particular connector, Kafka HDFS Connect (Confluent's HDFS sink connector), which usually comes pre-installed in the Confluent distribution of Kafka. If not, it can easily be installed from Confluent Hub by running the following command from the command line:

confluent-hub install confluentinc/kafka-connect-hdfs:latest

You can list all the connectors that are installed with:

confluent list connectors

As I said before, setting up a connector only involves writing a properties file and loading it into Kafka Connect. The full list of configuration properties available for Kafka HDFS Connect is in the Confluent documentation.

Below is a properties file I wrote that exports JSON data from a Kafka topic to a Kerberos-secured, high-availability (HA) Hadoop cluster:

name=hdfs-sinkpageviews
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=pageviewsjson
hdfs.url=hdfs://nameservice1
# determines how many records are read from Kafka
# before a file is committed to HDFS
flush.size=3
# for HA HDFS. Needs path to hadoop conf directory
hadoop.conf.dir=/confluent-5.2.1/config/hadoop-conf

# for secured hdfs
hdfs.authentication.kerberos=true
# in my case _HOST was empty, so the principal was just
# kerberosuser@REALM.COM
connect.hdfs.principal=kerberosuser/_HOST@REALM.COM
connect.hdfs.keytab=/path/to/keytabs/kerberosuser.keytab
hdfs.namenode.principal=hdfs/<HOST URL OF HDFS USER>@REALM.COM

# where to write files
topics.dir=/user/kerberosuser/topics
logs.dir=/user/kerberosuser/logs
format.class=io.confluent.connect.hdfs.json.JsonFormat

# worker config
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
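To load the connector, I used the same pre-5.3 Confluent CLI shown above; the properties file name below is simply whatever you saved the config as (a placeholder here), and the curl check assumes the Connect worker is listening on its default REST port, 8083:

confluent load hdfs-sinkpageviews -d hdfs-sinkpageviews.properties

curl http://localhost:8083/connectors

Once records start flowing, files for the pageviewsjson topic should appear under the topics.dir configured above, e.g. in /user/kerberosuser/topics/pageviewsjson/ on HDFS.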