fix(guide): simplify directory structure

This commit is contained in:
Mrugesh Mohapatra
2018-10-16 21:26:13 +05:30
parent f989c28c52
commit da0df12ab7
35752 changed files with 0 additions and 317652 deletions

---
title: Node.js Buffer
---
## Buffer
Binary is simply a set or collection of `1`s and `0`s. Each `1` or `0` in such a set is called a _bit_. A computer converts data to this binary format to store it and perform operations on it. For example, the following are five different binary numbers:
`10, 01, 001, 1110, 00101011`
JavaScript does not have a byte-level data type in its core API. To handle binary data, Node.js includes a binary buffer implementation through a global class called `Buffer`.
### Creating a Buffer
There are different ways to create a buffer in Node.js. For example, you can create an empty buffer with a size of 10 bytes:
```javascript
const buf1 = Buffer.alloc(10);
```
To create a buffer from a UTF-8-encoded string:
```javascript
const buf2 = Buffer.from('Hello World!');
```
There are several accepted encodings when creating a Buffer:
* ascii
* utf-8
* base64
* latin1
* binary
* hex
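As a small sketch, the same bytes can be produced from different encodings of the same text (the hex and base64 literals below are assumptions chosen for this example):

```javascript
// Build the same "Hello" buffer from hex and base64 input strings.
const fromHex = Buffer.from('48656c6c6f', 'hex');     // bytes 0x48 0x65 0x6c 0x6c 0x6f
const fromBase64 = Buffer.from('SGVsbG8=', 'base64'); // the same five bytes

console.log(fromHex.toString('utf-8'));    // Hello
console.log(fromBase64.toString('utf-8')); // Hello
```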
The Buffer API provides three separate functions to create new buffers. In the above examples we have seen `alloc()` and `from()`. The third one is `allocUnsafe()`.
```javascript
const buf3 = Buffer.allocUnsafe(10);
```
The buffer returned by this function is not zero-filled, so it might contain old data that needs to be overwritten.
### Interactions with Buffer
There are different interactions that can be made with the Buffer API. We are going to cover most of them here. Let us start with converting a buffer to JSON.
```javascript
let bufferOne = Buffer.from('This is a buffer example.');
console.log(bufferOne);
// Output: <Buffer 54 68 69 73 20 69 73 20 61 20 62 75 66 66 65 72 20 65 78 61 6d 70 6c 65 2e>
let json = JSON.stringify(bufferOne);
console.log(json);
// Output: {"type": "Buffer", "data": [84,104,105,115,32,105,115,32,97,32,98,117,102,102,101,114,32,101,120,97,109,112,108,101,46]}
```
The JSON output specifies that the type of the object being serialized is a Buffer, along with its data as an array of byte values. Converting an empty buffer to JSON shows that it contains nothing but zeros:
```javascript
const emptyBuf = Buffer.alloc(10);
emptyBuf.toJSON();
// Output: { "type": "Buffer", "data": [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ] }
```
Notice that the Buffer API also provides a direct function, `toJSON()`, to convert a buffer into a JSON object. To examine the size of a buffer, we can use the `length` property:
```javascript
emptyBuf.length;
// Output: 10
```
Now let us convert a buffer to a readable string, in our case the UTF-8-encoded one:
```javascript
console.log(bufferOne.toString('utf8'));
// Output: This is a buffer example.
```
`.toString()` by default converts a buffer to a UTF-8 format string. This is how you decode a buffer. If you specify an encoding, you can convert the buffer to another encoding:
```javascript
console.log(bufferOne.toString('base64'));
```
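Decoding works the same way in reverse. As a quick sketch, converting the base64 string back with `Buffer.from` recovers the original text:

```javascript
const original = Buffer.from('This is a buffer example.');
const asBase64 = original.toString('base64');

// Decode the base64 string back into a buffer, then into UTF-8 text.
const decoded = Buffer.from(asBase64, 'base64').toString('utf-8');
console.log(decoded); // This is a buffer example.
```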

---
title: Event Emitters
---
## Events and Streams
Traditionally, processing a file in a web server by reading and writing it consumes a lot of memory, because the whole file has to be loaded on every read or write. This is a waste of resources, and in terms of scalability and Big Data, wasted resources compromise a lot. Node.js's non-blocking, asynchronous model provides two options that give us a flow of data while consuming fewer resources: emitting events, and streams. In this section, we will take a look at both of them.
## EventEmitter Class
EventEmitters are one of the core ideas behind the architecture of event-driven, asynchronous programming in Node.js. An EventEmitter is an object, and in Node.js any object that emits an event is an instance of the EventEmitter class. What is an event? Every action taken by a Node.js program, such as initiating the web server or terminating the program when there are no requests left to fulfill, is considered a separate event.
To access the EventEmitter class in a Node.js program, you have to require the `events` module from the Node.js API. To create the object, we then make an instance of the EventEmitter class by calling its constructor function.
```js
const events = require('events');
const eventEmitter = new events.EventEmitter();
```
Or you can require the EventEmitter class directly, since the module exports it:
```js
const EventEmitter = require('events');
const eventEmitter = new EventEmitter();
```
The EventEmitter class provides various pre-defined methods to work with events, most importantly `.on` and `.emit` (there is also a special `error` event). Emitting an event triggers every callback function attached to it, wherever in the program those callbacks were registered. This is like broadcasting.
The ability to trigger an event can be done by following syntax:
```js
eventEmitter.emit(eventName, optionalData);
```
And the ability to attach a listener function to a specific event name is provided by `.on`:
```js
eventEmitter.on(eventName, callback);
```
Let us try out the functions we just learned about with an example. Create a new file called `eventemitter.js` and paste the following code:
```js
const EventEmitter = require('events');
const eventEmitter = new EventEmitter();
const callback = () => {
  console.log('callback runs');
};
eventEmitter.on('invoke', callback);
eventEmitter.emit('invoke');
eventEmitter.emit('invoke');
```
Now run the above example using the `node` command and you should get the following output:
```shell
callback runs
callback runs
```
We start by creating an eventEmitter instance, through which we get access to the `.on` method. The `.on` method registers the event under the name `invoke`, which we later pass to `.emit` to trigger the callback function associated with it.
There is another method the EventEmitter class provides, called `.once`. This method invokes the callback function associated with an event only the first time that event is emitted. Consider the example below.
```js
const EventEmitter = require('events');
const eventEmitter = new EventEmitter();
const callback = () => {
  console.log('callback runs');
};
eventEmitter.once('invoke', callback);
eventEmitter.emit('invoke');
eventEmitter.emit('invoke');
```
Output
```shell
callback runs
```
`.once` keeps track of whether its event has already been triggered and removes the listener after the first call, unlike the `.on` method, which keeps its listener attached. This is the major difference between the two.
## Understanding Streams
Node.js provides another way to work with data that avoids consuming a high amount of memory and is thus cost effective. This is what streams do. Basically, streams let you read data from a source and put it into a destination. Streams process data in chunks instead of all at once, making them suitable for working with large datasets. Many built-in Node.js modules use streams under the hood, for example HTTP requests and responses, TCP sockets, zlib, crypto, and the fs read and write streams.
### Type of Streams
In Node.js there are four types of streams:
- Readable
- Writable
- Duplex
- Transform
The most common of these are Readable and Writable streams. Readable streams are used to read data from a source, and Writable streams are used to write that data to a destination. Duplex streams can perform both read and write operations. Transform is a superset of Duplex streams, with the only difference being that the data can be modified as it is read or written.
### Type of Stream Events
All of these stream types are instances of the EventEmitter class, which means they emit events as they read and write. Each type of stream can emit the following events:
- data: This event is fired when the data is available to read by the readable stream
- error: This event is fired when there is an error reading or writing the data
- end: This event gets fired when there is no data left to read
## Readable Streams
A readable stream lets you read data from a source. This source can be anything: a buffer, a file, etc. First, create a simple text file called `abc.txt` from which we are going to read the data using a stream.
```text
I am Text file that contains data.
```
Now, create a new file called `read.js` which will implement reading data from this text file using a readable stream.
```js
const fs = require('fs');
const readableStream = fs.createReadStream('abc.txt');
let data = '';
readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});
```
If you run the above program, you will get the following output:
```shell
$ node read.js
I am Text file that contains data.
```
This is what we wrote inside the text file. To read data using a stream we use the `createReadStream()` function of the file system module `fs`.
When there is no data left to read, the readable stream emits the `end` event and the callback attached to it runs. The `.on` method is what we learned about in the previous section on the EventEmitter class; this shows that streams use the EventEmitter class behind the scenes.
## Writable Stream
Writable streams are used to write, insert, or append data to a destination. Like readable streams, they are also provided by the `fs` module. Create a new file called `write.js` in which we are going to use a readable stream to read data from the source and write it to a destination by creating a new `.txt` file.
```js
const fs = require('fs');
const readableStream = fs.createReadStream('./abc.txt');
const writeableStream = fs.createWriteStream('./dest.txt');
let data = '';
readableStream.on('data', function(chunk) {
  writeableStream.write(chunk);
});
```
When you run this program, a new file will be created by the writable stream. Writable streams use the `.write()` method to output data to the destination. In the above example, we are creating a new file called `dest.txt` that will contain the same data as `abc.txt`.
## Piping
Piping is a mechanism by which you can read data from a source and write it to a destination without writing as much code as we did above; it removes the need for the manual `.on('data')` and `.write()` calls.
If we rewrite the above example using pipe, it looks like this:
```js
const fs = require('fs');
const readableStream = fs.createReadStream('./abc.txt');
const writeableStream = fs.createWriteStream('./dest.txt');
readableStream.pipe(writeableStream);
```
Notice how many lines of code we have removed. We now only need the source and destination file paths, and to read and write the data we use the `.pipe()` method.

---
title: ExpressJS
---
## ExpressJS
When it comes to building web applications with Node.js, creating a server from scratch can take a lot of time. Over the years, Node.js has matured thanks to support from the community, and using it as a backend for web applications and websites helps developers start working on their application or product quickly. In this tutorial, we are going to look into Express, a Node.js framework for web development that comes with features like routing, rendering, and support for REST APIs.
## What is Express?
Express is the most popular Node.js framework because it requires minimal setup to start an application or an API, and is fast and unopinionated at the same time. In other words, it does not enforce its own philosophy that an application or API should be built in a specific way, unlike Rails and Django. Its flexibility is reflected in the number of `npm` modules available, which make it pluggable at the same time. If you have basic knowledge of HTML, CSS, and JavaScript, and of how Node.js works in general, you will be able to get started with Express in no time.
Express was developed by TJ Holowaychuk and is now maintained by the Node.js Foundation and open-source developers. To get started with Express development, you need to have Node.js and npm installed. You can install [Node.js](https://nodejs.org/en/) on your local machine, and along with it comes the command-line utility `npm`, which will help us install the plugins, or dependencies, our project needs later on.
To check if everything is installed correctly, please open your terminal and type:
```shell
node --version
v5.0.0
npm --version
3.5.2
```
If you get version numbers instead of an error, you have installed Node.js and npm successfully.
## Why use Expressjs?
Before we start using Express as a backend framework, let us first explore why we should consider it, and the reasons for its popularity.
* Express lets you build single page, multi-page, and hybrid web and mobile applications. Other common backend use is to provide an API for a client (whether web or mobile).
* It comes with a default template engine, Jade, which helps facilitate the flow of data into a website structure, and it supports other template engines as well.
* It supports MVC (Model-View-Controller), a very common architecture to design web applications.
* It is cross-platform and is not limited to any particular operating system.
* It leverages upon Node.js single threaded and asynchronous model.
Whenever we create a project using `npm`, our project must have a `package.json` file.
### Creating package.json
A JSON (JavaScript Object Notation) file contains all the information about an Express project: the modules installed, the name of the project, the version, and other meta information. To add Express as a module in our project, first we need to create a project directory and then create a `package.json` file.
```shell
mkdir express-app-example
cd express-app-example
npm init --yes
```
This will generate a `package.json` file in the root of the project directory. To install any module from `npm`, a `package.json` file needs to exist in that directory.
```json
{
"name": "express-web-app",
"version": "0.1.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"license": "MIT"
}
```
### Installing Express
Now that we have a `package.json` file, we can install Express by running the command:
```shell
npm install --save express
```
We can confirm that Express has installed correctly in two ways. First, there will be a new section in the `package.json` file named `dependencies`, under which Express is listed:
```json
{
"name": "express-web-app",
"version": "0.1.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"license": "MIT",
"dependencies": {
"express": "4.16.0"
}
}
```
The second way is that a new folder called `node_modules` appears in the root of our project directory. This folder stores the packages installed locally in our project.
## Building a Server with Express
To use our installed package for Express framework and create a simple server application, we will create the file, `index.js`, at the root of our project's directory.
```javascript
const express = require('express');
const app = express();
app.get('/', (req, res) => res.send('Hello World!'));
app.listen(3000, () => console.log('Example app listening on port 3000!'));
```
To start the server, go to your terminal and type:
```shell
node index.js
```
This will start the server. This bare-minimum application listens on port 3000. When we make a request through our browser on `http://localhost:3000`, our server responds with `Hello World!`; the browser is the client, and that is where the message is shown.
The first line of our code is using the `require` function to include the `express` module. This is how we include and use a package installed from npm in any JavaScript file in our project. Before we start using Express, we need to define an instance of it which handles the request and response from the server to the client. In our case, it is the variable `app`.
`app.get()` is a function that tells the server what to do when a GET request at the given route is made. It takes a callback function `(req, res)` that listens to the incoming request object `req` and responds accordingly using the response object `res`. Both `req` and `res` are made available to us by the Express framework.
The `req` object represents the HTTP request and has properties for the request query string, parameters, body, and HTTP headers. The res object represents the HTTP response that an Express app sends when it gets an HTTP request. In our case, we are sending a text `Hello World` whenever a request is made to the route `/`.
Lastly, `app.listen()` is the function that binds the server to a port and host, in our case `localhost`, to listen for incoming requests from a client. We can define the port number, such as `3000`.
## Anatomy of an Express Application
A typical structure of an Express server file will most likely contain the following parts:
**Dependencies**
Importing the dependencies such as the express itself. These dependencies are installed using `npm` like we did in the previous example.
**Instantiations**
These are the statements to create an object. To use express, we have to instantiate the `app` variable from it.
**Configurations**
These statements are the custom, application-based settings defined after the instantiations, or defined in a separate file (more on this when we discuss the project structure) and required in our main server file.
**Middleware**
These functions determine the flow of the request-response cycle. They are executed for every incoming request. We can also define custom middleware functions. There is a section on them below.
**Routes**
They are the endpoints defined in our server that perform operations for a particular client request.
**Bootstrapping Server**
The last statement that gets executed in an Express server is the `app.listen()` function, which starts our server.
We will now discuss the sections we haven't covered yet.
## Routing
Routing refers to how a server-side application responds to a client request at a particular endpoint. An endpoint consists of a URI (a path such as `/` or `/books`) and an HTTP method such as GET, POST, PUT, DELETE, etc.
Routes can serve either good old web pages or REST API endpoints. In both cases the syntax is similar; a route can be defined as:
```javascript
app.METHOD(PATH, HANDLER);
```
Routers are helpful in separating concerns, such as different endpoints, and in keeping the relevant portions of the source code together. They help in building maintainable code. All routes are defined before the call to `app.listen()`. In a typical Express application, `app.listen()` is the last function to execute.
### Routing Methods
HTTP is a standard protocol for a client and a server to communicate over. It provides different methods for a client to make requests. Each route has at least one handler function, or callback, which determines the response the server sends for that particular route. For example, a route defined with `app.get()` handles GET requests and, in this case, sends a simple message in response.
```javascript
// GET method route
app.get('/', (req, res) => res.send('Hello World!'));
```
### Routing Paths
A routing path, in combination with a request method, defines the endpoint at which requests can be made by a client. Route paths can be strings, string patterns, or regular expressions.
Let us define two more endpoints in our server based application.
```javascript
app.get('/home', (req, res) => {
  res.send('Home Page');
});

app.get('/about', (req, res) => {
  res.send('About Page');
});
```
Consider the above code as a bare-minimum website with two endpoints, `/home` and `/about`. If a client requests the home page, the server responds with `Home Page`, and on `/about` it sends the response `About Page`. We are using the `res.send` function to send a string back to the client whenever one of the two defined routes is matched.
### Routing Parameters
Route parameters are named URL segments used to capture the values specified at their position in the URL. The `req.params` object is used in this case, because it has access to all the parameters passed in the URL.
```javascript
app.get('/books/:bookId', (req, res) => {
  res.send(req.params);
});
```
The request URL from the client for the above source code would be `http://localhost:3000/books/23`. The name of a route parameter must be made up of word characters (`[A-Za-z0-9_]`). A related, very common use case in applications is a catch-all 404 route.
```javascript
// For invalid routes
app.get('*', (req, res) => {
  res.send('404! This is an invalid URL.');
});
```
If we now start the server from the command line using `node index.js` and try visiting the URL `http://localhost:3000/abcd`, we will get the 404 message in response.
## Middleware Functions
Middleware functions are functions that have access to the request object (`req`), the response object (`res`), and the `next` function in the application's request-response cycle. The objective of these functions is to modify the request and response objects for tasks like parsing request bodies, adding response headers, making other changes to the request-response cycle, ending the request-response cycle, and calling the next middleware function.
The `next` function is a function in the Express router which executes the middleware functions succeeding the current middleware. If a middleware function does not call `next()`, the request-response cycle ends there. The name `next` here is totally arbitrary and you can name it whatever you like, but it is important to stick to best practices and follow a few conventions, especially if you are working with other developers.
Also, when writing a custom middleware, do not forget to call the `next()` function. If you leave it out, the request-response cycle will hang in the middle of nowhere and your server might cause the client to time out.
Let us create a custom middleware function to grasp this concept. Take this code for example:
```javascript
const express = require('express');
const app = express();
// Simple request time logger
app.use((req, res, next) => {
  console.log('A new request received at ' + Date.now());
  // This call signals that more processing for the current request
  // happens in the next middleware function or route handler.
  next();
});
app.get('/home', (req, res) => {
  res.send('Home Page');
});

app.get('/about', (req, res) => {
  res.send('About Page');
});
app.listen(3000, () => console.log('Example app listening on port 3000!'));
```
To set up any middleware, whether custom or available as an npm module, we use the `app.use()` function. It takes one optional path parameter and one mandatory callback parameter. In our case, we are not using the optional path parameter.
```javascript
app.use((req, res, next) => {
  console.log('A new request received at ' + Date.now());
  next();
});
```
The above middleware function is called for every request made by the client. When running the server, you will notice that for every browser request on the endpoint `/`, a message is printed in your terminal:
```shell
A new request received at 1467267512545
```
Middleware functions can be used for a specific route. See the example below:
```javascript
const express = require('express');
const app = express();
//Simple request time logger for a specific route
app.use('/home', (req, res, next) => {
  console.log('A new request received at ' + Date.now());
  next();
});

app.get('/home', (req, res) => {
  res.send('Home Page');
});

app.get('/about', (req, res) => {
  res.send('About Page');
});
app.listen(3000, () => console.log('Example app listening on port 3000!'));
```
This time, you will only see a similar message when the client requests the endpoint `/home`, since that route is mentioned in `app.use()`. Nothing is printed in the terminal when the client requests the endpoint `/about`.
The order of middleware functions is important, since it determines when each middleware function is called. In our example above, if we define the route `app.get('/home')...` before the middleware `app.use('/home')...`, the middleware function will not be invoked.
### Third Party Middleware Functions
Middleware functions are a useful pattern that allows developers to reuse code within their applications and even share it with others in the form of npm modules. The essential definition of middleware is a function with three arguments: request (`req`), response (`res`), and `next`, as we observed in the previous section.
Often, in our Express-based server application, we will use third-party middleware functions. These functions are not part of the Express core; they are like plugins that can be installed using npm, which is why Express is called flexible.
Some of the most commonly used middleware functions in an Express application are:
#### bodyParser
It allows developers to process incoming data such as the body payload. The payload is just the data we receive from the client to be processed, most usefully with POST requests. It is installed using:
```shell
npm install --save body-parser
```
Usage:
```javascript
const bodyParser = require('body-parser');
// To parse URL encoded data
app.use(bodyParser.urlencoded({ extended: false }));
// To parse json data
app.use(bodyParser.json());
```
It is probably one of the most used third-party middleware functions in any Express application.
#### cookieParser
It parses the Cookie header and populates `req.cookies` with an object keyed by cookie names. To install it:
```shell
$ npm install --save cookie-parser
```
```javascript
const cookieParser = require('cookie-parser');
app.use(cookieParser());
```
#### session
This middleware function creates a session middleware with the given options. Sessions are often used in features such as login/signup. The package's name on npm is `express-session`:
```shell
$ npm install --save express-session
```
```javascript
const session = require('express-session');

app.use(
  session({
    secret: 'arbitrary-string',
    resave: false,
    saveUninitialized: true,
    cookie: { secure: true }
  })
);
```
### morgan
The morgan middleware logs all requests and other important information, depending on the output format specified.
```shell
npm install --save morgan
```
```javascript
const logger = require('morgan');
// ... Configurations
app.use(logger('common'));
```
`common` is a predefined format which you can use in the application. There are other predefined formats, such as `tiny` and `dev`, but you can also define your own custom format using the tokens morgan makes available.
A list of most used middleware functions is available at this [link](https://expressjs.com/en/resources/middleware.html).
## Serving Static Files
To serve static files such as CSS stylesheets, images, etc., Express provides a built-in middleware function, `express.static`. Static files are files that a client downloads from a server.
It is the only middleware function that comes bundled with the Express framework, so we can use it directly in our application; the other middlewares covered here are third party.
By default, Express does not serve static files; we have to use this middleware function. A common practice in web application development is to store all static files under a 'public' directory in the root of the project. We can serve this folder by writing the following in our `index.js` file:
```javascript
app.use(express.static('public'));
```
Now, the static files in our public directory will be loaded.
```shell
http://localhost:3000/css/style.css
http://localhost:3000/images/logo.png
http://localhost:3000/images/bg.png
http://localhost:3000/index.html
```
### Multiple Static Directories
To use multiple static assets directories, call the `express.static` middleware function multiple times:
```javascript
app.use(express.static('public'));
app.use(express.static('files'));
```
### Virtual Path Prefix
A fixed path prefix can also be provided as the first argument to the `express.static` middleware function. It is called a _Virtual Path Prefix_ since the actual path does not exist in the project.
```javascript
app.use('/static', express.static('public'));
```
If we now try to load the files:
```shell
http://localhost:3000/static/css/style.css
http://localhost:3000/static/images/logo.png
http://localhost:3000/static/images/bg.png
http://localhost:3000/static/index.html
```
This technique comes in handy when providing multiple directories to serve static files. The prefixes are used to help distinguish between the multiple directories.
## Template Engines
Template engines are libraries that allow us to use different template languages. A template language is a special set of instructions (syntax and control structures) that tells the engine how to process data. Using a template engine is easy with Express. Popular template engines such as Pug, EJS, Swig, and Handlebars are compatible with Express. However, Express comes with a default template engine, Jade, which is the first released version of Pug.
To demonstrate how to use a template engine, we will use Pug. It is a powerful template engine that provides features such as filters, includes, interpolation, etc. To use it, we first have to install it as a module in our project using `npm`.
```shell
npm install --save pug
```
This command will install pug; to verify that it installed correctly, just take a look at the `package.json` file. To use it with our application, we first have to set it as the template engine and create a new directory, `./views`, where we will store all the files related to our template engine.
```javascript
app.set('view engine', 'pug');
app.set('views', './views');
```
Since we are using `app.set()`, which indicates configuration within our server file, we must place these statements before we define any route or middleware function.
In the `views` directory, create a file called `index.pug`.
```pug
doctype html
html
  head
    title Hello from Pug
  body
    p.greetings Hello World!
```
To render this page, we will add the following route to our application.
```javascript
app.get('/hello', (req, res) => {
  res.render('index');
});
```
Since we have already set Pug as our template engine, we do not have to provide the `.pug` extension in `res.render`. This function renders the code in any `.pug` file to HTML for the client to display; browsers can only render HTML files. If you start the server now and visit the route `http://localhost:3000/hello`, you will see the output `Hello World!` rendered correctly.
Notice that in Pug we do not have to write closing tags for elements as we do in HTML. The above code will be rendered into HTML as:
```html
<!DOCTYPE html>
<html>
  <head>
    <title>Hello from Pug</title>
  </head>
  <body>
    <p class="greetings">Hello World!</p>
  </body>
</html>
```
The advantage of using a template engine over raw HTML files is the support it provides for performing tasks over data; HTML cannot render data directly. Frameworks like Angular and React share this behaviour with template engines.
You can also pass values to the template engine directly from the route handler function:
```javascript
app.get('/', (req, res) => {
  res.render('index', { title: 'Hello from Pug', message: 'Hello World!' });
});
```
For the above case, our `index.pug` file will be written as:
```pug
doctype html
html
  head
    title= title
  body
    h1= message
```
The output will be the same as in the previous case.
## Project Structure of an Express App
Since Express does not enforce much on the developer using it, sometimes it can be a bit overwhelming to decide which project structure to follow. There is no officially defined structure, but the most common approach in any Node.js-based application is to separate different tasks into different modules. This means having separate JavaScript files.
Let us go through a typical structure of an Express-based web application.
```
project-root/
node_modules/ // This is where the packages installed are stored
config/
db.js // Database connection and configuration
credentials.js // Passwords/API keys for external services used by your app
config.js // Environment variables
models/ // For mongoose schemas
books.js
things.js
routes/ // All routes for different entities in different files
books.js
things.js
views/
index.pug
404.pug
...
public/ // All static files
images/
css/
javascript/
app.js
routes.js // Require all routes in this and then require this file in
app.js
package.json
```
This pattern is commonly known as MVC (model-view-controller), because the database models, the UI of the application, and the controllers (in our case, routes) are written and stored in separate files. This design pattern keeps the code maintainable and makes the application easy to scale if you want to introduce more routes or static files in the future.
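The wiring between `app.js` and `routes.js` mentioned in the tree above can be sketched as follows. This is a minimal, standalone sketch: a stub object stands in for a real Express app, and the route module is a placeholder handler, so none of the project files are actually required.

```javascript
// routes/books.js would normally export an express.Router();
// here a plain handler function stands in for it:
const bookRoutes = (req, res) => res.end('books');

// routes.js — gather every route module and mount it on the app:
function mountRoutes(app) {
  app.use('/books', bookRoutes);
}

// app.js — create the app and register all routes in one call.
// A stub replaces express() so this sketch runs on its own:
const app = {
  routes: {},
  use(path, handler) {
    this.routes[path] = handler; // record what was mounted
  }
};
mountRoutes(app);
console.log(Object.keys(app.routes));
```

With a real Express app, `mountRoutes(app)` in `app.js` is the single line that pulls every route file into the application.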
## File System
The Node.js File System module allows you to work with the file system on your computer.
Node.js has a set of built-in modules which you can use without any further installation. The **File System module** contains the functions required to perform different operations on files, such as reading and writing.
To include a module, use the `require()` function with the name of the module.
```javascript
const fs = require('fs');
```
Common use for the File System module:
* Read files
* Create files
* Update files
* Delete files
* Rename files
## Reading a file
The `fs.readFile()` method is used to read a file on your computer. It takes three arguments: the filename, the encoding, and a callback function.
The following Node.js code reads a file from your computer and prints its content to the console.
```javascript
const fs = require('fs');
fs.readFile('input.txt', 'utf-8', (err, data) => {
if (err) {
console.log(err);
}
else {
console.log("Content present in input.txt file : " + data.toString());
}
});
```
The above code reads a file *input.txt* from your computer and returns the content to the console.
### Steps for execution:
* You should have Node.js installed in your computer.
* Create a file *app.js* and paste the above code.
* Create a file *input.txt* and write some content into it.
* Now open your console in the working directory and execute the command ``` node app.js ```.
*Note*: The *input.txt* file should be present in the same directory as your Node.js code file, otherwise an error will be thrown.
## Writing to a file
The `fs.writeFile()` method takes three arguments: the filename, the content, and a callback function.
Node.js code to write content into file.
```javascript
const fs = require('fs');
fs.writeFile('output.txt', "New content added", (err) => {
if (err) {
console.log(err);
}
else {
console.log("The file is saved");
}
});
```
The above code creates a file *output.txt* and adds the content *New content added* to it.
### Steps for execution:
* You should have Node.js installed in your computer.
* Create a file *app.js* and paste the above code.
* Now open your console in the working directory and execute the command ``` node app.js ```.
*Note*: If the file does not exist, the `fs.writeFile()` method creates it and writes the content into it. If the file already exists, its content is overwritten.
## Resources
* [Node.js API](https://nodejs.org/api/fs.html#fs_file_system)
* [W3 Schools](https://www.w3schools.com/nodejs/nodejs_filesystem.asp)
## HTTP
Node.js has a set of built-in modules which you can use without any further installation. The **HTTP module** contains the functions required to transfer data over the Hyper Text Transfer Protocol (HTTP).
The HTTP module can create an HTTP server that listens on server ports and sends a response back to the client.
To include the module, use the `require()` function with the name of the module.
```javascript
const http = require('http');
```
## Node.js as a Web Server
The `createServer()` method is used to create an HTTP server. The first argument of the `res.writeHead()` method is the status code (`200` means that all is OK); the second argument is an object containing the response headers.
```javascript
const http = require('http');
//create a server object:
http.createServer((req, res) => {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.write('Hello World!'); //write a response to the client
res.end(); //end the response
}).listen(8000); //the server object listens on port 8000
console.log("Server is listening on port no : 8000");
```
### Steps for execution:
* You should have Node.js installed in your computer.
* Create a file *app.js* and paste the above code.
* Now open your console in the working directory and execute the command ``` node app.js ```.
* Open your browser and enter ```http://localhost:8000```
*Note:* To stop the server, press `Ctrl + C` in your console.
## Resources
* [Node.js API](https://nodejs.org/api/http.html#http_http)
* [W3 Schools](https://www.w3schools.com/nodejs/nodejs_http.asp)
---
title: Node.js
---
## Node.js
Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient. Node.js' package ecosystem, npm, is the largest ecosystem of open source libraries in the world.
#### Let's break it down.
- JavaScript runtime built on Chrome's V8 JavaScript engine.
Every browser has a JavaScript engine built into it to process the JavaScript files contained in websites. Google Chrome uses the V8 engine, which is built using C++. Node.js uses this same super-fast engine to interpret JavaScript files.
- Node.js uses an event-driven model.
This means that Node.js waits for certain events to take place and then acts on them. Events can be anything from a click to an HTTP request. We can also declare our own custom events and make Node.js listen for them.
- Node.js uses a non-blocking I/O model.
We know that I/O tasks take much longer than processing tasks. Node.js uses callback functions to handle such requests.
Let us assume that a particular I/O task takes 5 seconds to execute, and that we want to perform this I/O twice in our code.
**Python**
```python
import time
def my_io_task():
time.sleep(5)
print("done")
my_io_task()
my_io_task()
```
**Node.js**
```javascript
function my_io_task() {
setTimeout(function() {
console.log('done');
}, 5000);
}
my_io_task();
my_io_task();
```
Both look similar, but the time taken to execute them differs: the Python code takes 10 seconds, while the Node.js code takes only 5 seconds.
Node.js takes less time because of its non-blocking I/O model. The first call to ```my_io_task()``` starts the timer and leaves it there. It does not wait for the response from the function; instead, it moves on to call the second ```my_io_task()```, starting the second timer.
When each timer completes its execution after 5 seconds, it calls its callback and prints ```done``` to the console. Since both timers are started together, they complete together and therefore take the same amount of time.
#### More information:
- [Official NodeJS site](https://nodejs.org)
- [Node Version Manager](https://github.com/creationix/nvm/blob/master/README.md)
- [n: Interactive NodeJS Version Manager](https://github.com/tj/n)
---
title: NPM
---
## NPM
Node.js makes it possible to write applications in JavaScript on the server. It's built on the V8 JavaScript engine and written in C++, so it's fast. Originally, it was intended as a server environment for applications, but developers started using it to create modules to aid them in local task automation. Since then, a whole new ecosystem of Node-based tools has evolved to transform the face of front-end development.
To make use of these modules (or packages) in Node.js we need to be able to install and manage them in a useful way. This is where npm, the Node package manager, comes in. It installs the packages you want to use and provides a useful interface to work with them.
## Installing NPM
To install `npm`, we have to download the Node.js binaries into our local environment; they include the latest version of npm. To verify the installation:
```shell
npm -v
5.6.0
```
Node Package Manager (npm) provides two main functionalities:
* An online repository for Node.js packages/modules, searchable on `npmjs.com`.
* A command-line utility to install Node.js packages and manage their versions and dependencies.
## Installing Modules using NPM
`npm` can install packages in local or global mode. By default, npm installs any dependency in local mode, placing the package in a `node_modules` folder in your current working directory, a location owned by the current user. Global packages are installed in `{prefix}/lib/node_modules/`, which is owned by root (`{prefix}` is usually `/usr/` or `/usr/local`). This means you would have to use `sudo` to install packages globally, which could cause permission errors when resolving third-party dependencies, as well as being a security concern.
### Installing Packages in Global Mode
Any package installed globally becomes available from the command line. We use the `--global` or `-g` flag to install packages globally.
```shell
$ npm install uglify-js --global
```
We can list the global packages we have installed with the npm list command.
```shell
$ npm list --global
/usr/local/lib
├─┬ npm@5.6.0
│ ├── abbrev@1.1.0
│ ├── ansi-regex@2.1.1
│ ├── ansicolors@0.3.2
│ ├── ansistyles@0.1.3
....................
└─┬ uglify-js@3.0.15
├─┬ commander@2.9.0
│ └── graceful-readlink@1.0.1
└── source-map@0.5.6
```
The output, however, is rather verbose. We can change that with the `--depth=0` option.
```shell
$ npm list -g --depth=0
/usr/local/lib
├── npm@5.6.0
└── uglify-js@3.0.15
```
### Installing Packages in Local Mode
When you install packages locally, you normally do so using a package.json file.
```shell
npm install --save express
```
Now you can use this module in your JS file as follows:
```js
const express = require('express');
```
Local modules are further divided into two types of dependencies: `devDependencies` and `dependencies`. The difference between the two is that `devDependencies` are modules required only during development, while `dependencies` are modules also required at runtime. To save a dependency as a devDependency on installation, run `npm install --save-dev` instead of `npm install --save`.
A nice shorthand for installing a devDependency is `npm i -D`. The shorthand for saving a regular dependency is `-S` instead of `-D`.
We can list the local packages we have installed with the `npm list` command.
```shell
$ npm list --depth=0
/usr/local/someDirectory
└── express@4.16.4
```
### Installing a Specific Version of a Package
To do so, we mention the package version we want to install.
```shell
$ npm install underscore@1.8.2 -S
```
### Uninstalling Packages
npm is a package manager, so it must also be able to remove a package:
```shell
$ npm uninstall underscore -S
```
To remove a global dependency, add the `-g` flag instead.
### Updating a Package
To update a package:
```shell
$ npm update underscore -S
```
To update a global dependency, add the `-g` flag instead.
To check if there is an update available for any package associated with our project:
```shell
$ npm outdated
Package Current Wanted Latest Location
underscore 1.8.2 1.8.3 1.8.3 project
```
The Current column shows us the version that is installed locally. The Latest column tells us the latest version of the package. And the Wanted column tells us the latest version of the package we can upgrade to without breaking our existing code.
## Managing Dependencies with package.json
If we install a module without a flag, as in `npm install express`, the module is placed in the local `node_modules` folder, but `package.json`, which keeps a record of every dependency used in the project, is not updated with the addition. The dependency will then be missing when the project is installed in another environment, such as at runtime. Make sure you always use the proper flag and keep the `package.json` file updated.
When you install packages locally, you need a `package.json` file. You can generate one using the `npm init` command. It will prompt you with some questions; pressing enter keeps the default values.
```shell
$ npm init
package name: (project)
version: (1.0.0)
description: Demo of package.json
entry point: (index.js)
test command:
git repository:
keywords:
author:
license: (ISC)
```
Think of `package.json` as the keeper of all dependencies, the manifest of a Node.js project. If you want a quicker way to generate a `package.json` file, use `npm init --y`.
List of common attributes in a `package.json` file:
* name - name of the package
* version - semantic version of the package
* description - description of the package
* homepage - homepage of the package
* author - author of the package
* contributors - names of the contributors to the package
* dependencies - list of dependencies. npm automatically installs all the dependencies mentioned here in the node_modules folder of the package.
* devDependencies - list of all development-specific dependencies
* repository - repository type and URL of the package
* main - entry point of the package
* keywords - keywords describing the package
* license - a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it.
* scripts - a dictionary containing script commands that are run at various times in the lifecycle of your package.
* config - an object that can be used to set configuration parameters used in package scripts that persist across upgrades.
An example:
```json
{
  "name": "express",
  "description": "Fast, unopinionated, minimalist web framework",
  "version": "4.11.2",
  "author": {
    "name": "TJ Holowaychuk",
    "email": "tj@vision-media.ca"
  },
  "contributors": [
    {
      "name": "Aaron Heckmann",
      "email": "aaron.heckmann+github@gmail.com"
    }
  ],
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/strongloop/express"
  },
  "homepage": "https://expressjs.com/",
  "keywords": [
    "express",
    "framework",
    "sinatra",
    "web",
    "rest",
    "restful",
    "router",
    "app",
    "api"
  ],
  "dependencies": {
    "serve-static": "~1.8.1"
  },
  "devDependencies": {
    "jade": "~1.9.1"
  },
  "engines": {
    "node": ">= 0.10.0"
  },
  "files": [
    "LICENSE",
    "History.md",
    "Readme.md",
    "index.js",
    "lib/"
  ],
  "scripts": {
    "test": "mocha --require test/support/env --reporter spec --bail --check-leaks test/ test/acceptance/",
    "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --require test/support/env --reporter dot --check-leaks test/ test/acceptance/",
    "test-tap": "mocha --require test/support/env --reporter tap --check-leaks test/ test/acceptance/",
    "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --require test/support/env --reporter spec --check-leaks test/ test/acceptance/"
  }
}
```
## npm Scripts
`npm` scripts are used to automate repetitive tasks, for example building your project or minifying Cascading Style Sheets (CSS) and JavaScript (JS) files. Scripts are also used for deleting temporary files and folders, among other things. They can be customized and are accessible via the `scripts` object in `package.json`.
```json
{
"name": "super-cool-package",
"version": "1.0.0",
"scripts": {}
}
```
An example of the most popular NPM script:
```json
"scripts": {
"start": "node index.js",
...
}
```
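Scripts are executed with the `npm run` command; `start` is special-cased and can be run directly. A sketch, assuming a hypothetical `build` script has been added alongside `start`:

```shell
# runs the "start" script (special-cased, no "run" needed)
npm start
# any custom script, e.g. "build", needs the "run" subcommand
npm run build
```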
## npm Cache
When npm installs a package, it keeps a copy, so the next time you want to install that package it doesn't need to hit the network. The copies are cached in the `.npm` directory in your home path.
```shell
$ ls ~/.npm
lodash.zipobject
log-driver
log-symbols
logalot
logfmt
loglevel
long-timeout
longest
longest-strea
```
This directory will get cluttered with old packages over time, so it's useful to clean it up occasionally.
```shell
$ npm cache clean
```
## Yarn - an alternative to npm
Yarn is another JavaScript package manager, developed and maintained by Facebook. Yarn and npm share high-level similarities in how they are used, and Yarn is reportedly faster at installing dependencies. To install it:
```shell
npm install -g yarn
```
Yarn doesn't intend to replace npm; it is more of an improvement on it. It uses the same `package.json` file and saves dependencies to the `node_modules/` folder. To initialise a project, you will use:
```shell
yarn init
```
### Adding, Upgrading, and Removing Dependencies
Adding a new dependency is easy and similar to npm:
```shell
yarn add [package-name]
```
If you want a specific package version or tag, you can do this.
```shell
yarn add express@4.14.1
```
For dev dependencies, peer dependencies, and optional dependencies, you pass the `--dev`, `--peer`, or `--optional` flag respectively.
```shell
yarn add gulp --dev
```
This will save gulp under `devDependencies`. To upgrade or remove a package, you just replace the `add` command with either `upgrade` or `remove`, followed by the package name.
```shell
# upgrade gulp from 3.9.1 to version 4
yarn upgrade gulp@4.0
# remove gulp
yarn remove gulp
```
After every install, upgrade, or removal, Yarn updates a `yarn.lock` file which keeps track of the exact package versions installed in the `node_modules` directory. A similar feature has since been added to npm: newer versions generate a `package-lock.json` file which behaves in exactly the same manner as `yarn.lock`.
### Package Version Numbers, and What they Mean
* The first release of an npm package is always 1.0.0.
* Bug fixes or other minor changes increment the third digit, hence 1.0.0 becomes 1.0.1.
* New features that don't break previous versions of a package increment the second digit, hence 1.0.0 becomes 1.1.0.
* All changes that break previous releases of a package increment the first digit, hence 1.0.0 becomes 2.0.0.
It is important to remember this when updating packages to keep your project stable!
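These version numbers appear in `package.json` as ranges. As a sketch (the packages and versions here are purely illustrative), a caret accepts new minor and patch releases, while a tilde accepts only patch releases:

```json
"dependencies": {
  "express": "^4.16.0",
  "underscore": "~1.8.2"
}
```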
---
title: Process Object
---
## Process Object
The `process` object in Node.js is a global object that can be accessed inside any module without requiring it. There are very few global objects or properties provided in Node.js, and `process` is one of them. It is an essential component of the Node.js ecosystem, as it provides various sets of information about the runtime of a program. To explore it, we will use one of its properties, `process.versions`, which tells us the version of Node.js we have installed along with its bundled components. It can be inspected from the command line with Node's `-p` flag.
```shell
$ node -p "process.versions"
# output
{ http_parser: '2.8.0',
node: '8.11.2',
v8: '6.2.414.54',
uv: '1.19.1',
zlib: '1.2.11',
ares: '1.10.1-DEV',
modules: '57',
nghttp2: '1.29.0',
napi: '3',
openssl: '1.0.2o',
icu: '60.1',
unicode: '10.0',
cldr: '32.0',
tz: '2017c' }
```
Another property you can check is `process.release`, which provides information similar to the `$ node --version` command that we used when we installed Node.js, but with more detailed output.
```shell
node -p "process.release"
# output
{ name: 'node',
lts: 'Carbon',
sourceUrl: 'https://nodejs.org/download/release/v8.11.2/node-v8.11.2.tar.gz',
headersUrl: 'https://nodejs.org/download/release/v8.11.2/node-v8.11.2-headers.tar.gz' }
```
These are some of the commands we can use from the command line to access information that no module otherwise provides. The `process` object is an instance of the EventEmitter class, and it contains its own predefined events, such as `exit`, which can be used to know when a program in Node.js has completed its execution. Run the program below and you can observe that the result comes up with status code `0`. In Node.js, this status code means that the program ran successfully.
```js
process.on('exit', code => {
setTimeout(() => {
console.log('Will not get displayed');
}, 0);
console.log('Exited with status code:', code);
});
console.log('Execution Completed');
```
Output of the above program:
```shell
Execution Completed
Exited with status code: 0
```
`process` also provides various properties to interact with. Some of them can be used in a Node application to provide a gateway for communication between the Node application and any command-line interface. This is very useful if you are building a command-line application or utility using Node.js:
- process.stdin: a readable stream
- process.stdout: a writable stream
- process.stderr: a writable stream for errors
Using `argv`, you can always access the arguments passed on the command line. `argv` is an array with Node itself as the first element and the absolute path of the file as the second element. From the third element onwards, it can contain any number of arguments.
Try the below program to get more insight into how you can use these various properties and functions.
```js
process.stdout.write('Hello World!' + '\n');
process.argv.forEach(function(val, index, array) {
console.log(index + ': ' + val);
});
```
If you run the above code with the following command, you will get this output; note that the first two elements of `argv` are printed.
```shell
$ node test.js
# output
Hello World!
0: /usr/local/bin/node
1: /Users/amanhimself/Desktop/articles/nodejs-text-tuts/test.js
```
---
title: Socket.io
---
## Socket.io
[Socket.io](https://socket.io/) is a Node.js library that helps make real-time communication between computers possible. To achieve this, Socket.io uses a combination of WebSockets and polling to establish a connection between the client's browser and the server. This allows it to work on older browsers that may not support pure WebSockets. The library uses [Engine.IO](https://github.com/socketio/engine.io) for building the connection.
### Demos
To get a taste of what is possible, Socket.io provides two demos showing its possible use-cases. You can find the chat demo at [https://socket.io/demos/chat/](https://socket.io/demos/chat/), with a link to the whiteboard demo on the left.
### Get Started
Since Socket.io is a Node.js library you have to make sure that Node.js is installed.
If it's not set up yet, get the latest version at [Nodejs.org](https://nodejs.org/).
#### macOS
Node.js can also be installed via [Homebrew](https://brew.sh/), a package manager for macOS.
Just type `brew install node` to install Node.js.
A [get started](https://socket.io/get-started/chat/) guide can also be found on Socket.io's page. It shows how to easily build a real-time chat in just a couple of lines.
#### More information
More information about Socket.io and its documentation can be found at:
- [Socket.io](https://socket.io/)
- [Socket.io Documentation](https://socket.io/docs/)
---
title: Streams
---
## Streams
Streams are available in the Node.js core API as objects that allow data to be read or written in a continuous way. A stream handles data in chunks as it becomes available, rather than buffering the entire payload in memory first, which keeps memory usage low and processing fast.
There are four types of streams available:
* Readable (streams from which data is read)
* Writable (streams to which data is written)
* Duplex (streams that are both Readable and Writable)
* Transform (Duplex Streams that can modify data as it is read and written)
Each stream type emits several events. Some of the common ones are:
* data (emitted when a chunk of data is available to read)
* end (emitted when there is no data left to read)
* error (emitted when there is an error either receiving or writing data)
### Pipe
In programming, the concept of a `pipe` is not new; Unix-based systems have used it since the 1970s. What does a pipe do? A `pipe` generally connects a source and a destination. It passes the output of one function as the input of another function.
In Node.js, `pipe` is used the same way, to pair inputs and outputs of different operations. `pipe()` is available as a function that takes a readable source stream and attaches the output to a destination stream. The general syntax can be represented as:
```javascript
src.pipe(dest);
```
Multiple `pipe()` functions can also be chained together.
```javascript
a.pipe(b).pipe(c);
// which is equivalent to
a.pipe(b);
b.pipe(c);
```
### Readable Streams
Streams that produce data that can be attached as the input to a writable stream are known as readable streams. To create a readable stream:
```javascript
const { Readable } = require('stream');
const readable = new Readable({
  read() {} // no-op: data is pushed into the stream manually below
});
readable.on('data', chunk => {
  console.log(`Received ${chunk.length} bytes of data.`);
});
readable.on('end', () => {
  console.log('There will be no more data.');
});
readable.push('Hello, World!');
readable.push(null); // signals the end of the stream
```
### Writable Stream
This is the type of stream that you can `pipe()` data to from a readable source. To create a writable stream, we use a constructor approach: we create an object from the `Writable` class and pass it an options object containing a `write` method. The method takes three arguments:
* chunk: a buffer containing the data to write
* encoding: the encoding of the chunk, if it is a string
* callback: a function that is called when the chunk is done processing
```javascript
const { Writable } = require('stream');
const writable = new Writable({
write(chunk, encoding, callback) {
console.log(chunk.toString());
callback();
}
});
process.stdin.pipe(writable);
```
### Duplex Streams
Duplex streams help us to implement both readable and writable streams at the same time.
```javascript
const { Duplex } = require('stream');
const inoutStream = new Duplex({
write(chunk, encoding, callback) {
console.log(chunk.toString());
callback();
},
read(size) {
this.push(String.fromCharCode(this.currentCharCode++));
if (this.currentCharCode > 90) {
this.push(null);
}
}
});
inoutStream.currentCharCode = 65;
process.stdin.pipe(inoutStream).pipe(process.stdout);
```
The `stdin` stream pipes the readable data into the duplex stream. The `stdout` helps us to see the data. The readable and writable parts of a duplex stream operate completely independent of each other.
### Transform Stream
This type of stream is more of an advanced version of the duplex stream.
```javascript
const { Transform } = require('stream');
const upperCaseTr = new Transform({
transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
});
process.stdin.pipe(upperCaseTr).pipe(process.stdout);
```
The data we are consuming is the same as in the previous duplex stream example. The thing to notice here is that `transform()` does not require implementing the `read` or `write` methods; it combines both of them itself.
### Why use Streams?
Since Node.js is asynchronous, it interacts with the disk and network by passing callbacks to functions. The example below reads data from a file on disk and sends it in response to a network request from a client.
```javascript
const http = require('http');
const fs = require('fs');
const server = http.createServer((req, res) => {
fs.readFile('data.txt', (err, data) => {
res.end(data);
});
});
server.listen(8000);
```
The above snippet of code will work, but for every request the entire file is first read into memory before the result is written back to the client. If the file we are reading is too large, this becomes a very heavy and expensive server call, as it consumes a lot of memory, and the user experience on the client side suffers from delay.
In this case, if we use streams, the data is sent to the client in chunks as soon as each chunk is received from the disk.
```javascript
const http = require('http');
const fs = require('fs');
const server = http.createServer((req, res) => {
const stream = fs.createReadStream('data.txt');
stream.pipe(res);
});
server.listen(8000);
```
The `pipe()` here takes care of writing, or in our case sending, the data with the response object, and of closing the connection once all the data has been read from the file.
Note: `process.stdin` and `process.stdout` are built-in streams in the global `process` object provided by the Node.js API.