Removing DRM from B&N Books

Walkthru

This post owes much to Liberate Your (Legally Obtained) Nook Ebooks. The disclaimer there applies here equally. This process should only be done to move your books from the Nook app/ereader to another reader (e.g. iBook, Aquile, Kobo ereader, etc.). DO NOT ABUSE this process to pirate books.

There are steps in the link above that weren’t clear to me as I was following along. The following contents will elaborate on sections where I initially ran into problems.

Steps

These steps should work fine on both Windows and macOS.

The top-level steps are:

  • Set up a virtual device that can run the Nook app and let me log into my Nook account to see my books
  • Extract the DRM-protected EPUB files for my books and store them on my local computer
  • Extract the key used to remove the DRM and save a copy on my local computer
  • Set up Calibre to open the DRM-protected EPUB files from my local computer and decrypt them using the extracted key

Setting up a virtual device

Install a copy of Android Studio. This can be done here: https://developer.android.com/studio

Once installed, running it brings up a splash dialog, from which the “Android Virtual Device Manager” tool (AVD Manager) can be launched:

Once the Android Virtual Device Manager is up, clicking the “Create Virtual Device” will start the process to create a virtual device.

  • For the “device definition,” I select Phone > Nexus 4
  • For the “system image,” I select Nougat (Android 7.1.1)

IMPORTANT: More recent device definitions and/or system images may not work. Either the apps won’t install, or, even if they do install, I won’t be able to get to the files due to lack of root permissions (e.g. adb root does not work).

Once the virtual device is created, start it by double-clicking it or by using the “Actions” buttons:

Once the device is started, open the Chrome browser in the device, then go to the Nook app’s download page by typing/pasting this into the address bar: https://apkpure.com/nook-read-ebooks-magazines/bn.ereader/versions

This brings up the download page of APKPure. Scroll down to find and click on the version 5.0.2.38. (Newer versions may not work.)

Clicking that box may bring up an ad that I can just close. Then I am asked whether to download the app or not.

Click DOWNLOAD.

Then I’m asked to Open the app. Click Open.

Then I am asked to Install. Click INSTALL.

The Nook app should then be started.

I sign in to the Nook app as I normally do on a real device to get to my books.

All the books in my library are shown, but I need to click each book once to download it. I do this for all the books I want to download.

Extracting the DRM-protected EPUB files

Keep the virtual device running.

Find and run the ADB tool (adb) that comes with the Android SDK installed by Android Studio.

  • For Windows, the location should be %USERPROFILE%\AppData\Local\Android\Sdk\platform-tools
  • For Macs, the location should be ~/Library/Android/sdk/platform-tools

Find the EPUB files for the downloaded books (should be “/data/data/bn.ereader/files/B&N Downloads/Books“):

adb shell
generic_x86:/ # ls /data/data/bn.ereader/files/B\&N\ Downloads/Books

If there are more than a handful, it may be useful to bundle them into a tarball before extracting them out of the virtual device:

generic_x86:/ # cd /data
generic_x86:/data # tar -czf epubs.tar.gz ./data/bn.ereader/files/B\&N\ Downloads/Books/*.epub

Exit the ADB shell.

To pull an EPUB file individually:

adb pull /data/data/bn.ereader/files/B\&N\ Downloads/Books/xxxxxxxx.epub C:/temp/xxxxxxxx.epub

Or to pull the tarball created above:

adb pull /data/epubs.tar.gz c:/temp/epubs.tar.gz

On Windows, *.tar.gz files can be opened by tools like 7-zip.
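
Alternatively, the archive can be unpacked with a few lines of Python using the standard tarfile module. This is just a sketch; the C:/temp paths are assumptions matching the adb pull examples above.

import tarfile

# Unpack the pulled archive; adjust the paths to wherever adb pull put the file.
with tarfile.open(r"C:/temp/epubs.tar.gz", "r:gz") as tar:
    tar.extractall(r"C:/temp/epubs")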

By now I should have a list of protected EPUB files. I can open them with EPUB readers, but they will be blank due to the DRM.

Extracting the key to use to remove the DRM

The key is embedded in an SQLite DB file inside the virtual device. Therefore, the ADB tool is used again. The location of the file should be /data/data/bn.ereader/databases/cchashdata.db:

adb pull /data/data/bn.ereader/databases/cchashdata.db C:/temp/cchashdata.db

Any SQLite browser (e.g. DB Browser for SQLite) can then be used to look into the cchashdata.db file.

Continuing with DB Browser for SQLite, start the tool, then open the pulled cchashdata.db file:

Once opened, use the “Browse Data” tab, then select the “cc_hash_data” table. There is only one record there, and its “hash” column will contain the key:

Copy the “hash” value, create a new text file, and paste the copied value into it as the only line. Save the file as “bn_hash.b64” (use whatever name is desired, but keep the “.b64” extension).
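
If a GUI SQLite browser isn’t handy, the same value can also be extracted with a short Python script using the built-in sqlite3 module. A minimal sketch, assuming the file was pulled to C:/temp as in the adb example above; the table and column names are the ones shown in DB Browser:

import sqlite3

# Read the single hash value out of the pulled cchashdata.db
conn = sqlite3.connect(r"C:/temp/cchashdata.db")
try:
    row = conn.execute("SELECT hash FROM cc_hash_data LIMIT 1").fetchone()
finally:
    conn.close()

hash_value = row[0]
if isinstance(hash_value, bytes):  # in case the column comes back as a blob
    hash_value = hash_value.decode("utf-8")

# Write it out as the only line of the .b64 key file
with open(r"C:/temp/bn_hash.b64", "w") as f:
    f.write(hash_value + "\n")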

Set up Calibre to open the DRM-protected EPUB files

Download a ZIP (DeDRM_tools_x.y.z.zip) of the latest DeDRM_tools from here: https://github.com/apprenticeharper/DeDRM_tools/releases

Unzip the downloaded DeDRM_tools_x.y.z.zip (again, use 7-zip or similar tool for Windows) to extract the file DeDRM_plugin.zip. DO NOT unzip the DeDRM_plugin.zip file itself.

NOTE that DeDRM_plugin.zip is itself a zip file inside the DeDRM_tools_x.y.z zip.

Download a copy of the Calibre tool from here: https://calibre-ebook.com/download

Run the downloaded installer to install Calibre. Then start up Calibre.

From Calibre, click the “Preferences” icon on the top tool bar:

Scroll down the Preferences dialog and click on “Plugins“:

Then click “Load plugin from file” and open the file DeDRM_plugin.zip from above.

This will add a new entry “DeDRM (x.y.z)” under the “File type” section in the Plugins list.

Select the entry and click “Customize plugin.”

A small dialog will come up. Click “Barnes and Noble ebooks.”

Clicking “Barnes and Noble ebooks” will bring up another dialog.

Click “Import Existing Keyfiles” and select the bn_hash.b64 file (or whatever name used for the .b64 file) created earlier.

A new entry “bn_hash” (or whatever the name used for the .b64 file) is added.

Click Close. Then click “OK” on the “Customize DeDRM” dialog box. Then click “Apply” in the “Plugins” dialog box. Lastly, click “Close” on the Preferences dialog box.

Yes. That’s a total of FOUR dialog boxes.

Open and Save the EPUB files

With the plugin configured, I can now open the DRM-protected EPUB files in Calibre.

I can then save the book (e.g. via Calibre’s “Save to disk”). The saved copy, however, will no longer have DRM protection. That copy can now be imported or opened by any EPUB reader.

Django I18N on Windows

django, i18n, programming

A concise runbook on I18N to get things started.

Prerequisite

Know the difference between locale name and language name:

  • Locale name: <language code>_<COUNTRY CODE> (examples: “en_US”, “fr_CA”), case sensitive.
  • Language name: <language subtag>-<range subtag> (examples: “en-us”, “fr-ca”), case insensitive.

They usually look alike, EXCEPT that locale names use an underscore (“_”) to separate the language and country, while language names use a dash (“-”) to separate the subtags.
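
Django itself ships small helpers that convert between the two forms, which is a handy way to see the difference. A quick illustration (assumes Django is installed):

from django.utils.translation import to_locale, to_language

print(to_locale("en-us"))    # "en_US"  -- locale name (underscore, country uppercased)
print(to_language("fr_CA"))  # "fr-ca"  -- language name (dash, all lowercase)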

Creating a .po file for translation

  • Install the prerequisite gettext library. See gettext on Windows.
  • Create a “locale” subdirectory in the project’s base directory¹.
  • Run python manage.py makemessages -l <locale name> [-l <locale name> ...] and provide locale names. (makemessages extracts the strings marked for translation in code and templates; see the sketch after this list.)
    E.g.
    python manage.py makemessages -l en_US -l fr_CA
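
For reference, makemessages only picks up strings that are marked for translation. A minimal sketch of what such markings look like (the module and strings here are purely illustrative, not part of any project above):

# e.g. in a hypothetical myapp/views.py
from django.http import HttpResponse
from django.utils.translation import gettext as _
from django.utils.translation import gettext_lazy

WELCOME_LABEL = gettext_lazy("Welcome")  # lazy variant for module-level strings

def greeting(request):
    # makemessages extracts "Hello, world!" as a msgid into each .po file
    return HttpResponse(_("Hello, world!"))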

Translate each .po file

This can be done manually by editing each *.po file and filling in the msgstr values, or by sending the file to a translation service.

Compile the translated .po file

  • Run python manage.py compilemessages
  • This will produce a .mo file (e.g. django.mo) next to each .po file (e.g. django.po). A quick way to verify the result is sketched below.
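
To verify that a compiled catalog is actually picked up, activate a language and translate a known string from inside python manage.py shell (so the project settings are loaded). A sketch, assuming “fr-ca” was one of the locales created above and its .po file has translations (e.g. for the “Hello, world!” msgid from the earlier sketch):

from django.utils.translation import activate, gettext

activate("fr-ca")                # language name, not locale name
print(gettext("Hello, world!"))  # prints the fr_CA msgstr if the .mo file was found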

Configure the Django settings file

Adding the LocaleMiddleware

Add the middleware “django.middleware.locale.LocaleMiddleware” to the MIDDLEWARE list.

IMPORTANT: it must follow “django.contrib.sessions.middleware.SessionMiddleware” and precede “django.middleware.common.CommonMiddleware”.

For example:

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.locale.LocaleMiddleware',  # Order is important
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

Set the languages supported

Add/edit the setting LANGUAGES, listing all the languages supported. The value should be a list of tuples (language name, description). For example:

LANGUAGES = [
    ('en', 'English'),
    ('es-419', 'Spanish Latin America'),  # NOTE: language name, not locale name
]

Set the path(s) to locale subdirs

Add the path to the “locale” subdirectory where the *.mo files can be found. For example:

LOCALE_PATHS = [
    BASE_DIR / "locale",
]

  1. NOTE that it is also possible to create additional “locale” subdirs under individual apps’ subdirs, as long as they are included in the LOCALE_PATHS setting.

Loading Resources from Python Packages

programming, Python

Start with importlib_resources

Start with the package importlib_resources.

Properly export modules

In order for a resource to be accessible, the module (or, most likely, the submodule) containing it needs to be properly exported. By properly exported I mean adding the submodule containing the resource to the packages list in setup.py.

In this case, I wanted to make a JSON file (answers.json) accessible from a submodule (v8ball.van) of the package vans-eightball:

v8ball
  van
    answers.json
    ...

To export the submodule, the packages property in the setup.py file needs to include “v8ball.van” in order for the resource answers.json to be exported and accessible:

setup(
    name="vans-eightball",
    version="0.0.2",
    ...
    packages=["v8ball.van", "v8ball"],
    include_package_data=True,
    ...
)
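
Note that include_package_data=True only picks up data files that are declared elsewhere (typically in a MANIFEST.in). If the JSON file still doesn’t end up in the built distribution, one alternative is to name the resource explicitly with package_data. A sketch based on the layout above, not necessarily what the published package uses:

from setuptools import setup

setup(
    name="vans-eightball",
    version="0.0.2",
    packages=["v8ball.van", "v8ball"],
    # Explicitly list the non-.py resource shipped with the subpackage
    package_data={"v8ball.van": ["answers.json"]},
)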

Accessing the resource

An example of accessing the resource:

Install the package

pip install vans-eightball

Accessing the resource

import json
import v8ball.van
from importlib_resources import files

# Locate answers.json inside the installed v8ball.van subpackage and load it
resource_path = files(v8ball.van).joinpath('answers.json')
data = json.loads(resource_path.read_text())

Python Package Publishing Notes

programming, Python

Foundations

For the most part, I followed the doc How to Publish an Open-Source Python Package to PyPI – Real Python. However, if that had been enough, I wouldn’t need to write this page.

Refinements

Installing twine by itself isn’t enough; wheel is also required:

pip install wheel
pip install twine

If wheel is not installed, I get this error when trying to build:

usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] …]
or: setup.py --help [cmd1 cmd2 …]
or: setup.py --help-commands
or: setup.py cmd --help
error: invalid command 'bdist_wheel'

Test Publishing and Installing

To publish to the test repo:

twine upload --repository-url https://test.pypi.org/legacy/ dist/*

To test installing from the test repo:

pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ <package name>==<version>

PostgreSQL and MySQL Docker Containers

DB, docker, mysql, postgresql

To quickly get instances of PostgreSQL and MySQL up and running, use the following docker-compose setup:

Create a subdirectory to place the docker-compose.yml and optionally the data files for the DBs:

Windows

cd %USERPROFILE%
mkdir dbs\data\mysql
mkdir dbs\data\psql
cd dbs

Others

cd ~
mkdir -p dbs/data/mysql
mkdir -p dbs/data/psql
cd dbs

Add this docker-compose.yml to start with:

version: '3.1'
services:
  mysql:
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: password
    ports:
      - "3306:3306"
    expose:
      - "3306"
    volumes:
      - ./data/mysql:/var/lib/mysql
  postgresql:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    ports:
      - "5432:5432"
    expose:
      - "5432"
    volumes:
      - ./data/psql:/var/lib/postgresql/data

To bring them up:

docker-compose up

By default:

  • The PostgreSQL instance uses postgres / password as the admin.
  • The MySQL instance uses root / password for the admin.
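
A quick way to confirm both instances are reachable is a small Python check from the host. This is only a sketch and assumes psycopg2-binary and mysql-connector-python are installed (e.g. via pip):

import psycopg2
import mysql.connector

# PostgreSQL: postgres / password, port 5432
pg = psycopg2.connect(
    host="localhost", port=5432,
    dbname="postgres", user="postgres", password="password",
)
print("PostgreSQL server version:", pg.server_version)
pg.close()

# MySQL: root / password, port 3306
my = mysql.connector.connect(
    host="127.0.0.1", port=3306, user="root", password="password",
)
print("MySQL server version:", my.get_server_info())
my.close()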

Django Dev Env w/ Docker Compose

django, docker, postgresql, programming

A few years back, I ran into a problem when working with Django on Windows while my colleagues were on Mac OS: a datetime routine (I forget which one) behaved differently between us. Even after syncing the versions of Python and Django between us, the discrepancy remained. It turned out to be due to a difference between Python on Windows and Python on Mac OS. We ended up working around it by not using that routine.

Thinking back now, I guess the problem could’ve been avoided if we had used Docker or Vagrant or similar, so that we would at least all be on the same environment. It’s the type of thing a “real” work environment would’ve had in place. But since we were working on that project on our own as a hobby, we didn’t think too much about it.

ALSO: Docker Desktop or even Linux on Windows Home was not available at the time, so most likely I would’ve had to wrestle with Docker Toolbox and VirtualBox, which still had problems with host volumes.

UPDATE: this post has been updated on 2022-05 based on new learnings.

Setting Up Environment in Docker

If I were to do it now, this is how I would do it:

  • Create a subdirectory for DB data. We were using PostgreSQL, so I would create something like C:\dbdata\ and use a host volume to mount it to the container’s /var/lib/postgresql/data.
  • Use the postgres and python:3 base images from Docker Hub.

Step-by-step, here’s how I would set it up:

Project scaffold

NOTE: the following is using “myproject” as the name of the Django project. Replace it with the name of your Django project as appropriate.

cd dev/projects
mkdir dj
cd dj

Create two starter versions of Dockerfile and docker-compose.yml:

Dockerfile

FROM python:3.7-buster
ENV PYTHONUNBUFFERED 1

WORKDIR /code
#COPY Pipfile Pipfile.lock /code/
#
RUN pip install pipenv
#RUN pipenv install

docker-compose.yml

version: '3'
services:
  app:
    build: .
#    command: >
#      sh -c "pipenv run python manage.py migrate &&
#             pipenv run python manage.py runserver 0.0.0.0:8000"
    ports:
      - "8000:8000"
    expose:
      - "8000"
    volumes:
      - ./:/code
    tty: true
    stdin_open: true

Then build and start up the containers:

docker-compose build
docker-compose run --rm app /bin/bash

# the following commands are run inside the container started above
pipenv install
pipenv install django
pipenv install <other stuff as needed>

pipenv run django-admin startproject myproject .
pipenv run django-admin startapp myapp

Now uncomment the lines previously commented in Dockerfile and docker-compose.yml.

PostgreSQL Setup

Modify myproject/settings.py to use PostgreSQL:

...
DATABASES = {
    #'default': {
    #    'ENGINE': 'django.db.backends.sqlite3',
    #    'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    #}
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',  # MUST match the service name for the DB
        'PORT': 5432,
    }
}
...

All pipenv-related operations should be done inside the container.

docker-compose run --rm app /bin/bash
pipenv install psycopg2-binary

Modify docker-compose.yml to bring up the DB and app containers:

version: '3'
services:
  # service name must match the HOST in myproject/settings.py's DATABASES
  db:
    image: postgres
    environment:
      # Must match the values in myproject/settings.py's DATABASES
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      # Put the DB data for myproject under myproject_db 
      # so that I can add more projects later
      - PGDATA=/var/lib/postgresql/data/myproject_db
    ports:
      - "5432:5432"
    expose:
      - "5432"
    volumes:
      # host volume where DB data are actually stored
      - c:/dbdata:/var/lib/postgresql/data
  app:
    build: .
    command: >
      sh -c "pipenv run python manage.py migrate &&
             pipenv run python manage.py runserver 0.0.0.0:8000"
    ports:
      - "8000:8000"
    expose:
      - "8000"
    volumes:
      - ./:/code
    depends_on:
      - db

The above:

  • sets up two “services” (containers): a “db” service for the DB in addition to the “app” service for the app.
  • sets up a host mount (for the “db” service) of c:\dbdata to the container’s /var/lib/postgresql/data where PostgreSQL stores/uses data for the DBs. This will allow the data to persist beyond the container’s life time.
  • sets up the PGDATA environment variable, which tells PostgreSQL to use /var/lib/postgresql/data/myproject_db as its data subdirectory; because of the mount, this ends up as c:\dbdata\myproject_db on my Windows host. This allows c:\dbdata to be used as a parent subdirectory for multiple project DBs.

Bring Up The Environment

Just run:

docker-compose up --build app

The above will:

  • Build the images and start the containers for the db and app services.
  • Initialize a new empty PostgreSQL database.
  • Run the Django migrations to prime the database for Django.
  • Run the app and have it listen on port 8000.

NOTE: there may be a race condition on the first run where the DB is still being built/initialized when the app service starts.

This error happens in that case:

web_1 | psycopg2.OperationalError: could not connect to server: Connection refused
web_1 | Is the server running on host "db" (172.19.0.2) and accepting
web_1 | TCP/IP connections on port 5432?

Just wait until the “db_1” service finishes initializing, hit CTRL-C, and run the

docker-compose up --build app

command again. It should now work fine.

Optionally, start up the “db” service first in the background, then start up the “app” service:

docker-compose up -d db
docker-compose up app
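
Another option is to make the app wait for the DB itself before migrating. A minimal sketch of a hypothetical wait_for_db.py (it reuses the psycopg2-binary already installed above and the DB settings from myproject/settings.py):

# wait_for_db.py -- loop until PostgreSQL accepts connections, then exit
import time

import psycopg2

while True:
    try:
        psycopg2.connect(
            host="db", port=5432,
            dbname="postgres", user="postgres", password="postgres",
        ).close()
        break
    except psycopg2.OperationalError:
        print("DB not ready yet, retrying...")
        time.sleep(1)

print("DB is up.")

The compose command for the “app” service could then run it first, e.g. sh -c "pipenv run python wait_for_db.py && pipenv run python manage.py migrate && pipenv run python manage.py runserver 0.0.0.0:8000".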

Docker Now Available for Windows Home

docker, Windows

For the longest time, Docker just didn’t like Windows Home. Legend has it that, with VirtualBox running a Linux VM, you could then install Docker Toolbox on top of that. I’ve tried that route. It works until I try to do volume mounts to the host file system. Somehow, somewhere in the Docker > VirtualBox > Windows Home chain, something’s not right.

With the recent changes to Windows to add Linux support, Docker can now run on Windows Home. Unless there are other reasons (e.g. Remote Desktop), there is no need to upgrade to Windows Professional ($99 USD).

Install Linux Support

The process to install Docker on Windows Home is kinda long and spans multiple pages. Here is the summary:

  • Get the “Windows 10 May 2020 Update” or later by downloading and running the “Windows 10 Update Assistant”: https://www.microsoft.com/en-us/software-download/windows10 . NOTE: This will take a while.
  • Run PowerShell as Administrator (Windows-S; type “PowerShell”; click “Run as Administrator”)
  • Run in PowerShell:
    dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart
  • Restart Windows
  • Run PowerShell as Administrator (Windows-S; type “PowerShell”; click “Run as Administrator”)
  • Run in PowerShell:
    wsl --set-default-version 2
  • Open Microsoft Store and search for “Windows Subsystem for Linux” (WSL) (https://aka.ms/wslstore) and install a Linux distribution (e.g. “Ubuntu”)
  • Run the Linux distribution. You will be prompted to create a user account and set its password. See https://docs.microsoft.com/en-us/windows/wsl/install-win10#troubleshooting-installation for troubleshooting if necessary.

Install Docker for Windows

  • Go to https://hub.docker.com/editions/community/docker-ce-desktop-windows/ and download “Docker Desktop for Windows” (Don’t worry about the wording about Windows Professional). This downloads the “Docker Desktop Installer.exe” file that you then run.
  • Typically this will install into C:\Program Files\Docker\Docker. The executable is “Docker Desktop.exe”.

References

https://hub.docker.com/editions/community/docker-ce-desktop-windows/

https://docs.microsoft.com/en-us/windows/wsl/install-win10