Friday, November 4, 2016

Quick and Dirty Web Service (to serve Parcels and Packages) from a Mac

The following command can be run from a Mac to stand up a simple web server that serves the files located in /path/to/parcels.  Very useful when spinning up a quick CM/CDH repro cluster and needing a host to serve CM/CDH/other parcels.

$ cd /path/to/parcels; ifconfig | grep inet; sudo python -m SimpleHTTPServer 80

(Originally sourced from: http://lifehacker.com/start-a-simple-web-server-from-any-directory-on-your-ma-496425450, modified to fit this use-case)


The breakdown of the above command:
1) Change directory to the location to share/make available
2) Report all available network interfaces and display their IP addresses
3) Run Python's built-in SimpleHTTPServer module on port 80.
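Note that SimpleHTTPServer is Python 2 only; on newer systems where only Python 3 is available, the module was renamed to http.server. A sketch of the equivalent, assuming python3 is on the PATH (demonstrated here against a temp directory on an unprivileged port; for the parcel use-case, run it from /path/to/parcels, with sudo for port 80 as in the original command):

```shell
# Python 3 equivalent of the SimpleHTTPServer one-liner above.
dir=$(mktemp -d)
echo 'hello' > "$dir/index.html"
cd "$dir"
# Serve the current directory in the background on port 8123:
python3 -m http.server 8123 >/dev/null 2>&1 &
srv=$!
sleep 1
curl -s http://localhost:8123/index.html    # serves back: hello
kill "$srv"
```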

Thursday, September 29, 2016

Quickly update user password in an automated fashion


Helpful when updating a user's password from a script on small test clusters:

$ ssh remoteserver 'useradd newuser; $PASSWD_COMMAND'

where $PASSWD_COMMAND is a placeholder, not a shell variable — substitute one of the following commands in its place (the single quotes would otherwise keep a local variable from expanding before it reaches the remote host):

    echo -e "newpassword\nnewpassword" | passwd newuser

    echo "newuser:newpassword" | chpasswd


    echo "newpassword" | passwd --stdin newuser
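The chpasswd variant also scales to several accounts at once, since it reads one "user:password" pair per line. A sketch with made-up usernames and password; on a real host the generated lines would be piped into chpasswd as root:

```shell
# Generate one "user:password" line per test account; on the remote
# side this output would be fed to chpasswd, e.g.:
#   ... | ssh root@remoteserver chpasswd
for u in alice bob carol; do
  printf '%s:%s\n' "$u" "changeme123"
done
# prints alice:changeme123, bob:changeme123, carol:changeme123
```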

Saturday, September 17, 2016

Install Mac OSX El Capitan to a USB stick via CLI


On the command line, copy and paste the following.  NOTE: Substitute the path of the actual USB stick in place of  /Volumes/:

sudo /Applications/Install\ OS\ X\ El\ Capitan.app/Contents/Resources/createinstallmedia --volume /Volumes/ --applicationpath /Applications/Install\ OS\ X\ El\ Capitan.app


Sourced from:
http://www.macworld.com/article/2981585/operating-systems/how-to-make-a-bootable-os-x-10-11-el-capitan-installer-drive.html




Friday, September 16, 2016

Install Cisco's Webex Plugin on the Opera Web Browser

Confirmed to install and run with the following versions of software:


  • Opera Web Browser 39.0.2256.71
  • Cisco Webex Extension Version 1.0.1 (Updated on 9/5/2014)

Dependencies:
  • Opera needs the CRX Extension Source Viewer extension installed to allow installation of Chrome extensions (see step 1 below).

Per the Webex support document for installing the Webex Extension, the following link is used to install the Webex extension:

https://chrome.google.com/webstore/detail/cisco-webexextension/jlhmfgmfgeifomenelglieieghnjghma?hl=en&authuser=1


Installation Steps:

  1. Install the CRX Extension Source Viewer in Opera
  2. Navigate to the Cisco Webex Extension plugin install page
  3. Click on the yellow CRX button at the top of the Opera page
  4. Click on Install
  5. Opera will navigate to its Extensions page and prompt to install.  Click the install button, and the Cisco Webex plugin should install.
  6. Once completed, the Cisco Webex Extension should appear in the Opera Extensions page.



Tuesday, August 16, 2016

Sort a group of log4j-based log files by timestamp

Assuming that each log file begins and ends with a complete timestamp (i.e. it isn't abruptly truncated), here's a quick command to list the first and last timestamps of a set of log4j log files, sorted in ascending order by first timestamp:
$ (for i in *.log.out*; do echo -en "$i\t$(sed '1p;$!d' "$i" | cut -d ' ' -f-2 | tr '\n' '\t')\n"; done) | sort -k2

(Note the -e flag on echo: in bash, plain echo -n would print the \t and \n escapes literally.)  To sort by the last timestamp of each file, change the sort argument from -k2 to -k4.

Example:
$ (for i in *.log.out*; do echo -en "$i\t$(sed '1p;$!d' "$i" | cut -d ' ' -f-2 | tr '\n' '\t')\n"; done) | sort -k2
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.30 2016-08-12 13:09:30,297 2016-08-12 13:53:27,785
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.29 2016-08-12 13:53:27,791 2016-08-12 16:02:24,046
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.28 2016-08-12 16:02:24,051 2016-08-12 16:13:18,553
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.27 2016-08-12 16:13:18,555 2016-08-12 16:40:21,115
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.26 2016-08-12 16:40:21,123 2016-08-12 17:26:27,145
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.25 2016-08-12 17:26:27,153 2016-08-12 17:27:35,976
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.24 2016-08-12 17:27:37,404 2016-08-12 17:56:26,459
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.23 2016-08-12 17:56:26,463 2016-08-12 18:25:09,816
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.22 2016-08-12 18:25:09,822 2016-08-12 19:10:34,036
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.21 2016-08-12 19:10:34,081 2016-08-12 19:44:30,899
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.20 2016-08-12 19:44:30,996 2016-08-12 20:01:21,222
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.19 2016-08-12 20:01:21,363 2016-08-12 21:23:20,933
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.18 2016-08-12 21:23:21,183 2016-08-12 23:14:29,238
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.17 2016-08-12 23:14:29,429 2016-08-13 00:47:05,370
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.16 2016-08-13 00:47:05,376 2016-08-13 01:01:45,803
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.15 2016-08-13 01:01:45,808 2016-08-13 02:23:24,499
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.14 2016-08-13 02:23:24,499 2016-08-13 09:41:18,893
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.13 2016-08-13 09:41:18,898 2016-08-13 11:05:50,145
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.12 2016-08-13 11:05:50,149 2016-08-13 11:58:56,914
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.11 2016-08-13 11:58:56,919 2016-08-13 13:58:17,794
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.10 2016-08-13 13:58:17,800 2016-08-13 15:55:48,996
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.9 2016-08-13 15:55:49,001 2016-08-13 17:05:04,935
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.8 2016-08-13 17:05:04,939 2016-08-13 17:58:42,547
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.7 2016-08-13 17:58:42,552 2016-08-13 18:13:34,622
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.6 2016-08-13 18:13:34,627 2016-08-13 19:41:18,039
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.5 2016-08-13 19:41:18,045 2016-08-13 21:13:34,207
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.4 2016-08-13 21:13:34,209 2016-08-13 23:13:13,734
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.3 2016-08-13 23:13:13,737 2016-08-14 00:04:13,013
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.2 2016-08-14 00:04:13,017 2016-08-14 00:58:07,933
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out.1 2016-08-14 00:58:07,937 2016-08-14 01:43:06,945
hadoop-cmf-hdfs2-NAMENODE-namenode01.company.com.log.out 2016-08-14 01:43:07,107 2016-08-14 01:56:02,070
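The heavy lifting is done by sed '1p;$!d', which prints only a file's first line (via 1p) and last line (by deleting every line that isn't the last). A self-contained sketch on a synthetic log file (filename and log contents are made up for the demo):

```shell
# Build a tiny synthetic log4j-style log, then extract its first and
# last timestamps exactly as the one-liner above does:
printf '%s\n' \
  '2016-08-12 13:09:30,297 INFO Startup' \
  '2016-08-12 13:30:00,000 INFO Working' \
  '2016-08-12 13:53:27,785 INFO Shutdown' > demo.log
sed '1p;$!d' demo.log | cut -d ' ' -f-2
# prints the first and last timestamps, one per line:
# 2016-08-12 13:09:30,297
# 2016-08-12 13:53:27,785
```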

Thursday, May 19, 2016

Mac OSX (El Capitan) Software Update tool via CLI


List available software updates via CLI (but don't install them):

$ sudo softwareupdate -l
Software Update Tool
Copyright 2002-2015 Apple Inc.

Finding available software
Software Update found the following new or updated software:
   * OS X El Capitan Update-10.11.5
OS X El Capitan Update (10.11.5), 740450K [recommended] [restart]
   * RAWCameraUpdate6.19-6.19
Digital Camera RAW Compatibility Update (6.19), 7575K [recommended]
   * iTunesXPatch-12.4
iTunes (12.4), 144804K [recommended]



Install all pending software updates via CLI:

$ sudo softwareupdate -i -a
Software Update Tool
Copyright 2002-2015 Apple Inc.

Finding available software

Downloading OS X El Capitan Update
Downloading Digital Camera RAW Compatibility Update
Downloading iTunes
Downloaded Digital Camera RAW Compatibility Update
Downloaded iTunes
Downloaded OS X El Capitan Update
Installing OS X El Capitan Update, Digital Camera RAW Compatibility Update, iTunes
Done with OS X El Capitan Update
Done with Digital Camera RAW Compatibility Update
Done with iTunes
Done.

You have installed one or more updates that requires that you restart your
computer.  Please restart immediately.

Thursday, February 25, 2016

Useful settings for Mac OS X (Mavericks onward) / Adding GPX files directly to a Garmin Fenix 3

Here are a few settings that I came across [1] that helped make my daily workflow easier while working on a Mac:

defaults write com.apple.finder AppleShowAllFiles YES
defaults write com.apple.finder QLEnableTextSelection -bool TRUE
killall Finder

Let's cover what they do:

defaults write com.apple.finder AppleShowAllFiles YES

The command above makes the Finder show all hidden files, which can be useful for power users (or for those who have a Fenix 3 and need to access the NEWFILES folder in /Volumes/Garmin/Garmin to upload .GPX files [2]).


Do you use the Quick Look feature much in OS X?  If not, it's certainly a very useful feature for quickly peeking into files from the Finder.  If you do use it and would like to copy and paste frequently from these files (e.g. source files or general text files), the command below will enable this:

defaults write com.apple.finder QLEnableTextSelection -bool TRUE


Of course, we will need to either log out of the current session or restart the Finder.  You can restart the Finder by running the following:

killall Finder


In case you want to revert the above settings to default, you can run the following commands respectively:

defaults write com.apple.finder AppleShowAllFiles NO
defaults write com.apple.finder QLEnableTextSelection -bool FALSE
killall Finder


References / Sourced from:

[1] Macworld UK:
How to show hidden files and folders in Mac OS X Finder
How to select and copy text from Quick Look previews in OS X

[2] Itamar Latnik's Youtube post: Garmin Fenix 3 - Import a GPX Course

Tuesday, January 5, 2016

Calculate total amount of Physical RAM on all Active TaskTrackers in MR1 using Unix tools

The following should accomplish this, assuming the interest is in how much total physical RAM is available across all active TaskTrackers (i.e. not blacklisted) in MR1, using common Unix commands:

for i in $(curl -s 'http://jobtracker.company.com:50030/machines.jsp?type=active' | grep 'href="http://' | cut -d '=' -f2 | cut -d '"' -f2); do curl -s "${i}jmx" | grep 'TotalPhysicalMemorySize' | grep -Po '[0-9]+'; done | paste -sd+ - | bc

NOTE:  Replace http://jobtracker.company.com with the appropriate hostname of the JobTracker.

The above command retrieves the list of active TaskTrackers from the JobTracker, iterates through the list, sums the TotalPhysicalMemorySize metric from each TaskTracker's JMX output, and displays the result in bytes.

Example output:
[user@node10 ~]$ for i in $(curl -s 'http://node1.com:50030/machines.jsp?type=active' | grep 'href="http://' | cut -d '=' -f2 | cut -d '"' -f2); do curl -s "${i}jmx" | grep 'TotalPhysicalMemorySize' | grep -Po '[0-9]+'; done | paste -sd+ - | bc

23488339968
[user@node10 ~]$



As replied here: https://www.quora.com/How-can-I-check-total-RAM-in-Hadoop-cluster-running-MR1-not-YARN