BigBinary Blog


Rails 7 deprecates Enumerable#sum and Array#sum

This blog is part of our Rails 7 series.

Rails 7 deprecates calling Enumerable#sum (and Array#sum) with non-numeric arguments. To avoid the deprecation warning, we should pass a suitable initial argument.

Before Rails 7

>> %w[foo bar].sum
=> "foobar"

>> [[1, 2], [3, 4, 5]].sum
=> [1, 2, 3, 4, 5]

After Rails 7

>> %w[foo bar].sum
=> Rails 7.0 has deprecated Enumerable.sum in favor of Ruby's native implementation available since 2.4.
   Sum of non-numeric elements requires an initial argument.

>> [[1, 2], [3, 4, 5]].sum
=> Rails 7.0 has deprecated Enumerable.sum in favor of Ruby's native implementation available since 2.4.
   Sum of non-numeric elements requires an initial argument.

To avoid the deprecation warning, we should pass a suitable initial argument, as shown below.

>> %w[foo bar].sum('')
=> "foobar"

>> [[1, 2], [3, 4, 5]].sum([])
=> [1, 2, 3, 4, 5]

Check out this pull request for more details.

Aashish Saini in Rails, Rails 7
June 22, 2021

Rails 7 adds method calls for nested secrets

This blog is part of our Rails 7 series.

Rails stores secrets in config/credentials.yml.enc, which is encrypted and cannot be edited directly. You can read more about credentials management here: Rails security guide.
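For reference, the encrypted file is edited through the Rails CLI rather than by hand; Rails decrypts it into your editor (whatever $EDITOR points to) and re-encrypts it on save:

EDITOR="vim" bin/rails credentials:edit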

Rails 7 allows nested encrypted secrets (credentials) to be accessed via method calls. We can access the nested secrets present in the credentials YAML file just like we've always accessed top-level secrets:

# config/credentials.yml.enc

secret_key_base: "47327396e32dc8ac825760bb31f079225c5c0"
aws:
  access_key_id: "A6AMOGVNQKCWLNQ"
  secret_access_key: "jfm6b9530tPu/h8v93W4TkUJN+b/ZMKkG"

>> Rails.application.credentials.aws
=> {:access_key_id=>"A6AMOGVNQKCWLNQ", :secret_access_key=>"jfm6b9530tPu/h8v93W4TkUJN+b/ZMKkG"}

Before Rails 7

>> Rails.application.credentials.aws[:access_key_id]
=> "A6AMOGVNQKCWLNQ"

>> Rails.application.credentials.aws.access_key_id
=> NoMethodError (undefined method `access_key_id' for #<Hash:0x00007fb1adb0cca8>)

After Rails 7

>> Rails.application.credentials.aws.access_key_id
=> "A6AMOGVNQKCWLNQ"

Check out this pull request for more details.

Ashik Salman in Rails, Rails 7
June 9, 2021

Using Cookies with Postgraphile

This blog details the usage of cookies in a Postgraphile-based application. We will be using Postgraphile with Express for processing the cookies, but any similar library can be used.

Cookies can be a very safe method of client-side storage. They can be set with the following attributes (illustrated in the sketch right after this list):

  • HTTP only: cannot be accessed through client-side JavaScript, which protects them from third-party client-side scripts or browser extensions.
  • Secure: the web browser sends the cookie only over a secure (HTTPS) channel.
  • Signed: we can sign the content to make sure it isn't changed on the client side.
  • Same Site: ensures that the cookie is sent only if the site matches your domain/subdomain (details)
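These attributes map directly onto the options accepted by Express's res.cookie, which we will use later in this post. As a quick preview, here is a minimal sketch; the /login route, cookie name, and value are placeholders for illustration:

const express = require('express');
const cookieParser = require('cookie-parser');

const app = express();
// A secret is required for signed cookies
app.use(cookieParser(process.env.SECRET_KEY));

app.get('/login', (req, res) => {
  res.cookie('session', 'some-value', {
    httpOnly: true,     // not readable from client-side JavaScript
    secure: true,       // sent only over HTTPS
    signed: true,       // signed with the cookie-parser secret
    sameSite: 'strict', // sent only on same-site requests
  });
  res.sendStatus(204);
});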

Prerequisites

  • Postgraphile - Generates an instant GraphQL API from a Postgres database
  • Express - Minimalistic backend framework for NodeJS

Setup

We will start off with a base Express setup generated with express-generator.

const createError = require('http-errors');
const express = require('express');
const path = require('path');
const cookieParser = require('cookie-parser');
const logger = require('morgan');

const app = express();

require('dotenv').config();

app.use(logger('dev'));
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(express.static(path.join(__dirname, 'public')));

// Use the secret key to sign the cookies on creation and parsing
app.use(cookieParser(process.env.SECRET_KEY));

// Catch 404 and forward to error handler
app.use(function(req, res, next) {
  next(createError(404));
});

// Error handler (Express recognizes it by its four-argument signature)
app.use(function(err, req, res, next) {
  // Set locals, only providing error in development
  res.locals.message = err.message;
  res.locals.error = req.app.get('env') === 'development' ? err : {};

  // Render the error page
  res.status(err.status || 500);
  res.render('error');
});

module.exports = app;

From Postgraphile's library usage page, this is how to add Postgraphile to an Express app:

app.use(
  postgraphile(
    process.env.DATABASE_URL || "postgres://user:pass@host:5432/dbname",
    "public",
    {
      watchPg: true,
      graphiql: true,
      enhanceGraphiql: true,
    }
  )
);

Now for the table setup. We need a private user_accounts table and a method named authenticate_user that will return a JWT token of the form:

{
  token: 'jwt_token_here',
  username: '',
  ...anyOtherDetails
}

We will not be detailing table creation or authentication as there are many ways to go about it. If you need help, the Postgraphile security page is the one to rely on.

Adding the Plugin library

To attach a cookie to the response, we will use the @graphile/operation-hooks library, which is open source on GitHub.

npm install @graphile/operation-hooks
# OR
yarn add @graphile/operation-hooks

To add the library to the app:

const { postgraphile, makePluginHook } = require("postgraphile");

const pluginHook = makePluginHook([
  require("@graphile/operation-hooks").default,
  // Any more PostGraphile server plugins here
]);

app.use(
  postgraphile(
    process.env.DATABASE_URL || "postgres://user:pass@host:5432/dbname",
    "public",
    {
      watchPg: true,
      graphiql: true,
      enhanceGraphiql: true,
      pluginHook,
      appendPlugins: [
        // You will be adding the hooks here
      ]
    }
  )
);

Adding the Plugin

The plugin allows for two different types of hooks:

  1. SQL Hooks
  2. JavaScript Hooks

Since accessing cookies is a JavaScript operation, we will be concentrating on the second type.

To hook the plugin into the build system, we can use the addOperationHook method.

module.exports = function OperationHookPlugin(builder) {
  builder.hook("init", (_, build) => {
    // Register our operation hook (passing it the build object):
    // useAuthCredentials is a function we will define later.
    build.addOperationHook(useAuthCredentials(build));

    // Graphile Engine hooks must always return their input or a derivative of
    // it.
    return _;
  });
};

If this is contained in a file named set-auth-cookie.js, then the plugin can be added to the append plugins array as follows:

{
  appendPlugins: [
    require('./set-auth-cookie.js'),
  ],
}

Designing the hook

The function to be executed receives two arguments: the build object and the current fieldContext.

The fieldContext consists of fields that can be used to narrow down the mutation or query that we want to target; e.g. if the hook is to run only on mutations, we can use the fieldContext.isRootMutation field.

const useAuthCredentials = (build) => (fieldContext) => {
  const { isRootMutation } = fieldContext;
  if (!isRootMutation) {
    // No hook added here
    return null;
  }
}

To tell the system how to use the plugin, we have to return an object with before, after, or error fields. Here is how these keywords can be used:

(comments are from the example repository)

return {
    // An optional list of callbacks to call before the operation
    before: [
      // You may register more than one callback if you wish. They will be mixed in with the callbacks registered from other plugins and called in the order specified by their priority value.
      {
        // Priority is a number between 0 and 1000. If you're not sure where to put it, then 500 is a great starting point.
        priority: 500,
        // This function (which can be asynchronous) will be called before the operation. It will be passed a value that it must return verbatim. The only other valid return is `null` in which case an error will be thrown.
        callback: logAttempt,
      },
    ],

    // As `before`, except the callback is called after the operation and will be passed the result of the operation; you may return a derivative of the result.
    after: [],

    // As `before`; except the callback is called if an error occurs; it will be passed the error and must return either the error or a derivative of it.
    error: [],
  };

Since we want our action to happen after we get the result from the mutation, we will add it to the after array.

const useAuthCredentials = (build) => (fieldContext) => {
  const { isRootMutation, pgFieldIntrospection } = fieldContext;
  if (!isRootMutation) {
    // No hook added here
    return null;
  }

  if (!pgFieldIntrospection ||
    // Name of the mutation is authenticateUser
    pgFieldIntrospection.name !== "authenticateUser") {
      // narrowing the scope down to the mutation we want
      return null;
  }

  return {
    before: [],
    after: [
      {
        priority: 1000,
        callback: (result, args, context) => {
          // The result is here, so we can access accessToken and username.
          console.log(result);
        }
      }
    ],
    error: []
  };
}

Since the functionality is inside the plugin hook, we do not have the Express response available to set the cookie 😞.

But we do have an escape hatch with the third argument: context. Postgraphile allows us to pass functions or values into the context variable from the postgraphile instance.

app.use(
  postgraphile(
    process.env.DATABASE_URL,
    "public",
    {
      async additionalGraphQLContextFromRequest(req, res) {
        return {
          // Function to set the cookie passed into the context object
          setAuthCookie: function (authCreds) {
            res.cookie('app_creds', authCreds, {
              signed: true,
              httpOnly: true,
              secure: true,
              // Check if you want to include SameSite cookies here, depending on your hosting.
            });
          },
        };
      }
    }
  )
);

We can now set the cookie inside the plugin hook.

{
  priority: 1000,
  callback: (result, args, context) => {
    // This function is passed from additionalGraphQLContextFromRequest as detailed in the snippet above
    context.setAuthCookie(result);
  }
}

Reading from the Cookie 🍪

We have already added cookieParser with the SECRET_KEY, so Express will parse and verify the cookies for us.
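For instance, any plain Express handler can read the verified value from req.signedCookies. A minimal sketch; the /whoami route is hypothetical, and app_creds is the cookie name used in setAuthCookie above:

app.get('/whoami', (req, res) => {
  // cookie-parser verifies the signature and exposes the value here;
  // the property is undefined when the cookie has not been set
  const creds = req.signedCookies['app_creds'];
  res.json(creds || null);
});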

But we probably want them to be accessible inside SQL functions for Postgraphile. That is how we can determine if the user is signed in or what their permissions are. To do that, Postgraphile provides a pgSettings object.

app.use(
  postgraphile(
    process.env.DATABASE_URL,
    "public",
    {
      pgSettings: async req => ({
        'user': req.signedCookies['app_creds'],
      }),
    },
  )
);

Inside an SQL function, the variables passed from settings can be accessed like this:

current_setting('user')

That's all 🎉. We can store any details in cookies, retrieve them on the Express end and use them inside Postgres functions for authentication or authorization.

Check out the operation-hooks plugin for more details.

Agney Menon in PostGraphile
June 1, 2021

Debug Node.js app running in a Docker container

A Docker container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. Anyone who has worked with containers has probably wanted to debug their application the way they normally would, but it often feels difficult to configure. Let's do it in a simple way.

1. Install Docker Extension for VSCode

[Image: VSCode Docker extension]

This extension enables VSCode to debug an app running inside a Docker container. It also lets us manage Docker images and containers.

2. Expose port 9229 in the docker-compose.yml

Port 9229 is the default Node.js debugging port.

This binds the container's port to the host machine's port, enabling the VSCode debugger to attach to it.

version: "3.9"
services:
  backend:
    container_name: nodejs
    restart: always
    build:
      context: .
    ports:
      - "80:3000"
      - "5678:5678"
      - "9229:9229"
    command: yarn dev
OR

If you are running the app directly from the command line, you can append -p 9229:9229 to the docker run command. Example:

docker run -d -p 80:3000 -p 9229:9229 node:15.0.1-alpine

3. Add the inspect switch to the npm script

nodemon --inspect=0.0.0.0:9229 --watch server server/bin/www

Make sure you add the address and port to the inspect switch. When started with the --inspect switch, a Node.js process listens for a debugging client, by default on 127.0.0.1:9229. For Docker, we have to change this to 0.0.0.0:9229 so the debugger can connect from outside the container.
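For context, this command typically lives in the dev script of package.json, which is what command: yarn dev in the docker-compose file above invokes. A sketch; your script name and paths may differ:

{
  "scripts": {
    "dev": "nodemon --inspect=0.0.0.0:9229 --watch server server/bin/www"
  }
}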

4. Create a VSCode launch.json

You can generate a launch.json from the Debug tab of VSCode. Click on the drop-down and select Add Configuration.

[Image: VSCode debugger configuration drop-down]

This action will generate a default JSON configuration. Replace its content with the following:

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Docker: Attach to Node",
      "type": "node",
      "request": "attach",
      "restart": true,
      "port": 9229,
      "address": "localhost",
      "localRoot": "${workspaceFolder}",
      "remoteRoot": "/usr/src/app",
      "protocol": "inspector"
    }
  ]
}

5. Start the Docker container and attach the debugger

After your Docker container has been successfully launched, you can attach the debugger to it at any time by clicking the play button in the same Debug tab where we built the launch.json configuration.

VSCode will now adjust the color of its bottom status bar to indicate that it is in debugging mode, and you are ready to go.

You can put breakpoints anywhere in your file and get your work done faster and better than before.

Bonus Tip

Since the debugger has full access to the Node.js execution environment, a malicious agent that can bind to this port can execute arbitrary code on behalf of the Node.js process. It is important to understand the security implications of exposing the debugger port on public and private networks. Make sure you are not exposing the debugging port in a production environment.

Preveen Raj in Docker, JavaScript
May 25, 2021

Helping Babel move to ES Modules

The Babel project recently moved to running builds on Node.js 14, which means that Babel can now use ES Modules (ESM) instead of CommonJS (CJS) to import/export modules for internal scripts.

Also, with the upcoming Babel 8.0.0 release, the team is aiming to ship Babel as native ES Modules. With this goal in mind, the team is shifting all CommonJS imports/exports to ESM ones. This is where I got the opportunity to contribute to Babel recently.

Why ES Modules though?

For a very long time, JS (or ECMAScript) did not have a standardized module import/export syntax. Various independent packages introduced formats to help work with modules in JS. In the browser, most code relied on the AMD (Asynchronous Module Definition) API implemented by the Require.js package, which had its own syntax and quirks.

CommonJS on the other hand was the standard used by Node.js, and it was no less quirky. Inconsistent formatting and poor interoperability between packages irked JS developers enough to demand a standard format.

Lately, the ECMAScript standardization body (TC39) has adopted ESM (ECMAScript modules) as the standard module format for JavaScript. Most web browsers already support this format, and Node.js 14 now provides stable support for it.

The task at hand

The next task was to convert all internal top-level scripts from CommonJS to ESM. The finer details of the implementation, along with interoperability issues with non-ESM files, would keep us dealing with CommonJS for some time though.

The simplest of changes was to replace require() statements in each file with import statements. For example, files starting like:

1"use strict";
2
3const plumber = require("gulp-plumber");
4const through = require("through2");
5const chalk = require("chalk");

would be modified like this:

import plumber from "gulp-plumber";
import through from "through2";
import chalk from "chalk";

to allow modules to be imported as ES modules.

In the above example, also note that because ES modules are in strict mode by default, the "use strict"; declarations were removed from the beginning of these top-level scripts.

Almost all current NPM packages are CommonJS packages, exposing their functionalities using the module.exports syntax.

In case a file/package exports more than one value, we need to use named imports instead:

import { chalk } from "chalk";

Where the default export object from a CommonJS module was named differently, it had to be aliased during import to avoid breaking pre-existing variables' names in the files being converted to ESM. For example,

const rollupBabel = require("@rollup/plugin-babel").default;

had to be replaced with:

import { babel as rollupBabel } from "@rollup/plugin-babel";

so we could keep using the variable rollupBabel in the file.

For instances where require() statements needed to be replaced by dynamic import() calls,

const getPackageJson = (pkg) => require(join(packageDir, pkg, "package.json"));

// replaced by
const getPackageJson = (pkg) => import(join(packageDir, pkg, "package.json"));

the subsequent calls everywhere now needed to be awaited:

   .forEach(id => {
      const { name, description } = getPackageJson(id);
   })

   // await added (the callback also becomes async)
   .forEach(async id => {
      const { name, description } = await getPackageJson(id);
   })

Other things like importing JSON modules are currently only supported in CommonJS mode. Those imports were left as-is.
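For reference, one common way to keep such a require() inside a file that has otherwise moved to ESM (a sketch, not necessarily what Babel settled on) is Node's createRequire helper:

// Re-create require() inside an ES module so JSON can still be loaded
// the CommonJS way (available since Node.js 12.2).
import { createRequire } from "module";

const require = createRequire(import.meta.url);
const packageJson = require("./package.json");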

Blockers

With all the changes made and committed, we bumped into the next big roadblock: package dependencies. Babel uses Yarn 2 internally, and particularly the PnP feature of Yarn 2. Unfortunately, the ESM loader API was experimental at the time and not being used by PnP. The Babel and Yarn teams coordinated to implement it soon after.

Similarly, Jest has its own custom loader for ESM, which meant it could not support testing ESM modules with Babel. That issue was side-stepped for the time being.

Network effects

The good thing about the whole grind of shifting from CommonJS to ESM is that a lot of other major packages are also considering and implementing ESM support. The shift to ESM-only by Babel is already building confidence in others to do the same. Special thanks to the Babel maintainers for setting a great example and encouraging others to move to ESM.

Conclusion

All told, it was a great experience adding a new feature to a well-maintained and widely-used package. The biggest lesson from this has to be how changes made in Babel affect and influence other major packages, and how maintainers of various major open source packages work in tandem to avoid breaking each other's code. It is a very open and collaborative ecosystem with people discussing and working through GitHub issues, comments, and even Twitter threads.

Check out the pull request for more details.

Karan Sapolia Sharma in JavaScript
May 18, 2021