
Hadoop Fundamentals for Data Scientist Video: supergroup permission

Hadoop Fundamentals for Data Scientists: The supergroup group shown in the video doesn't appear to exist in the VM available from bit.ly, which causes permission problems when running the examples. How can I enable it?
3 people have this problem
  • I can certainly look into this - do you mean the supergroup in HDFS or is this a Unix groups/users issue? If the issue is in HDFS, then please be sure you run the Hadoop daemons as the hadoop user before switching back to the student user.
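    The daemon-startup step mentioned above can be sketched as a short command sequence (a generic sketch: `start-dfs.sh` and `start-yarn.sh` are the standard Hadoop control scripts, and the `hadoop`/`student` usernames are the ones used in the course VM):

```shell
# Switch to the hadoop user before starting the daemons, so HDFS state
# (and the NameNode's view of "/") is owned by hadoop, not student.
su - hadoop
start-dfs.sh      # starts NameNode, DataNode, SecondaryNameNode
start-yarn.sh     # starts ResourceManager, NodeManager
jps               # verify the daemons are running
exit              # return to the student user for the exercises
```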

  • When Jenny introduces the VM environment in video 5, she runs the groups command, which lists the groups student, hadoop, supergroup, etc. I don't see supergroup when I run the same command in the VM I downloaded from bit.ly, so I'm guessing it's a Unix groups/users issue. The problem occurs when I run the wordcount MapReduce example: I get a no-write-permission error when the system attempts to create a temp directory. I have started the dfs and yarn daemons. I guess I need to change permissions, but I'm not sure how.

    Thank you for your help. Great videos by the way.

    - jon
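    As a quick way to compare what the video shows with your VM, you can inspect local Unix group membership directly (a generic sketch; student and hadoop are the names used in the course VM, and supergroup is expected to be absent because it lives in HDFS, not in Unix):

```shell
# List the Unix groups for the current user. In the course VM this
# should include 'hadoop' but NOT 'supergroup': supergroup is HDFS's
# default superuser group, not a local Unix group, so its absence
# from the `groups` output is normal.
id -nG

# Show the members of the local 'hadoop' group, if it exists
# (guarded so a missing group doesn't abort the script).
getent group hadoop || true
```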

  • Hmm, it sounds like you might need to format the namenode - have you already done this?

    No matter what, the supergroup shouldn't be affecting this - the only groups to be concerned with are the hadoop group and the student group.

    Where is the temp directory getting created?

    In an ironic twist, I have lost access to the videos (I was working with the production copies, which are out of production and now on sale, and I don't own the videos in my O'Reilly account). I'm attempting to get access so I can see what you're referring to and respond (which is why it's taking me so long to get back to you!).

  • Hi,

    I formatted the HDFS file system with

    hdfs namenode -format

    Now when I try to create the /user/student directory I get the error:

    student@hadoop:~$ hadoop fs -mkdir -p /user/student
    mkdir: Permission denied: user=student, access=WRITE, inode="/":hadoop:supergroup:drwxr-xr-x

    I guess I destroyed some permissions when I formatted HDFS. It might be better if I just built my Hadoop environment from scratch, following your instructions, to get a better handle on what I'm doing. My interest in Hadoop is basically to have an environment where I can play around with MapReduce algorithms for optimization.

    Thank you for your help. There is no immediate hurry.

  • That's actually an easy problem to solve: su to the hadoop user (su hadoop), create the student directory, and chown it to the student user. Once you've done that, the student user should have permissions and access.

    Here the supergroup is a little different as well: it's an HDFS group, not a local Unix group, which is why I asked earlier where the temp file was being created.

    Basically, in our scheme we've made the "hadoop" user the superuser and the "student" user the one working on it, because this is fairly common in production. Another option, if you're just looking to have an environment to play around in, is to simply use Spark with your local file system, or to use the Cloudera or Hortonworks VMs.

    Of course, if you're interested in some of the operations aspects, then setting it up from scratch will teach you a lot!
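    To make the fix above concrete, it can be sketched as a short command sequence (assuming the HDFS daemons are already running and hadoop is the HDFS superuser, as in the course VM):

```shell
# As the HDFS superuser, create the student's home directory in HDFS
# and hand ownership to the student user so it can write there.
su - hadoop
hadoop fs -mkdir -p /user/student
hadoop fs -chown student:student /user/student
exit                     # back to the student user
hadoop fs -ls /user      # /user/student should now be owned by student
```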