Compare commits

...

163 Commits

Author SHA1 Message Date
Kamil Myśliwiec
bcd4ea93b1 chore(@nestjs) publish v7.6.3 release 2020-12-17 12:29:07 +01:00
Kamil Myśliwiec
0cc464519f Merge branch 'master' of https://github.com/nestjs/nest 2020-12-17 12:23:26 +01:00
Kamil Myśliwiec
bacef33be1 fix(core): properly serialise symbols in dynamic modules #5964 2020-12-17 12:23:17 +01:00
Kamil Mysliwiec
c624c170dd Merge pull request #5915 from nestjs/fix/5701-await-kafka-response
fix(microservices): await respond callback, fix kafka (missing second-to-last item)
2020-12-17 12:19:56 +01:00
Kamil Mysliwiec
d9c4162bb2 Merge pull request #5929 from nestjs/renovate/typescript-4.x
chore(deps): update dependency typescript to v4.1.3
2020-12-17 12:19:45 +01:00
Kamil Mysliwiec
fc557353be Merge pull request #5963 from nestjs/renovate/light-my-request-4.x
fix(deps): update dependency light-my-request to v4.4.1
2020-12-17 12:02:06 +01:00
Renovate Bot
6c800fd2e3 fix(deps): update dependency light-my-request to v4.4.1 2020-12-17 10:28:33 +00:00
Kamil Mysliwiec
cee29d9406 Merge pull request #5962 from nestjs/renovate/nest-monorepo
fix(deps): update nest monorepo to v7.6.2
2020-12-17 09:37:17 +01:00
Renovate Bot
7d7cf01813 fix(deps): update nest monorepo to v7.6.2 2020-12-17 08:11:17 +00:00
Kamil Myśliwiec
a05ff8b6b0 chore(@nestjs) publish v7.6.2 release 2020-12-17 08:45:38 +01:00
Kamil Myśliwiec
b63cd49f83 Merge branch 'master' of https://github.com/nestjs/nest 2020-12-17 08:44:33 +01:00
Kamil Myśliwiec
be6c85c88d fix(common): fix undefined get timestamp method (logger) 2020-12-17 08:44:11 +01:00
Kamil Mysliwiec
93acb0af17 Merge pull request #5955 from nestjs/renovate/fastify-3.x
fix(deps): update dependency fastify to v3.9.2
2020-12-16 14:35:46 +01:00
Renovate Bot
a80547e4a6 chore(deps): update dependency typescript to v4.1.3 2020-12-16 13:33:33 +00:00
Renovate Bot
16246feddd fix(deps): update dependency fastify to v3.9.2 2020-12-16 13:09:59 +00:00
Kamil Mysliwiec
1a43fd05e8 Merge pull request #5932 from nestjs/renovate/node-14.x
chore(deps): update dependency @types/node to v14.14.14
2020-12-16 14:04:19 +01:00
Kamil Mysliwiec
24ce1dd6de Merge pull request #5954 from nestjs/renovate/nestjs-swagger-4.x
fix(deps): update dependency @nestjs/swagger to v4.7.7
2020-12-16 11:47:48 +01:00
Renovate Bot
51a2e48a4a fix(deps): update dependency @nestjs/swagger to v4.7.7 2020-12-16 10:16:01 +00:00
Renovate Bot
a3f32a342d chore(deps): update dependency @types/node to v14.14.14 2020-12-16 07:54:01 +00:00
Kamil Mysliwiec
9f96855a46 Merge pull request #5949 from nestjs/renovate/webpack-5.x
chore(deps): update dependency webpack to v5.10.3
2020-12-16 08:34:46 +01:00
Kamil Mysliwiec
e2e2a8f746 Merge pull request #5934 from nestjs/renovate/mongoose-5.x
fix(deps): update dependency mongoose to v5.11.8
2020-12-16 08:34:10 +01:00
Kamil Mysliwiec
094fc36115 Merge pull request #5938 from nestjs/dependabot/npm_and_yarn/typescript-4.1.3
chore(deps-dev): bump typescript from 4.1.2 to 4.1.3
2020-12-16 08:34:02 +01:00
Kamil Mysliwiec
854bf58910 Merge pull request #5951 from nestjs/renovate/babel-monorepo
chore(deps): update dependency @babel/preset-env to v7.12.11
2020-12-16 08:33:52 +01:00
Kamil Mysliwiec
5cba807a98 Merge pull request #5953 from nestjs/dependabot/npm_and_yarn/types/node-14.14.14
chore(deps-dev): bump @types/node from 14.14.13 to 14.14.14
2020-12-16 08:33:45 +01:00
dependabot[bot]
8df77f8f45 chore(deps-dev): bump @types/node from 14.14.13 to 14.14.14
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 14.14.13 to 14.14.14.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-16 05:27:26 +00:00
Renovate Bot
a796bfabea chore(deps): update dependency @babel/preset-env to v7.12.11 2020-12-16 00:09:19 +00:00
Renovate Bot
f6f6f917ad chore(deps): update dependency webpack to v5.10.3 2020-12-15 20:16:52 +00:00
Renovate Bot
f5770bf13b fix(deps): update dependency mongoose to v5.11.8 2020-12-15 10:02:46 +00:00
dependabot[bot]
1a4530e669 chore(deps-dev): bump typescript from 4.1.2 to 4.1.3
Bumps [typescript](https://github.com/Microsoft/TypeScript) from 4.1.2 to 4.1.3.
- [Release notes](https://github.com/Microsoft/TypeScript/releases)
- [Commits](https://github.com/Microsoft/TypeScript/commits)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-15 08:06:10 +00:00
Kamil Mysliwiec
2b1529923e Merge pull request #5930 from nestjs/renovate/webpack-5.x
chore(deps): update dependency webpack to v5.10.1
2020-12-15 09:04:08 +01:00
Kamil Mysliwiec
69e229774c Merge pull request #5947 from nestjs/dependabot/npm_and_yarn/clang-format-1.5.0
chore(deps-dev): bump clang-format from 1.4.0 to 1.5.0
2020-12-15 09:03:54 +01:00
Kamil Mysliwiec
82eb97401b Merge pull request #5948 from nestjs/dependabot/npm_and_yarn/mongoose-5.11.8
chore(deps-dev): bump mongoose from 5.11.7 to 5.11.8
2020-12-15 09:03:47 +01:00
Kamil Mysliwiec
5134ccd933 Merge pull request #5944 from nestjs/renovate/typescript-eslint-monorepo
chore(deps): update typescript-eslint monorepo to v4.10.0
2020-12-15 09:03:26 +01:00
dependabot[bot]
e23970d748 chore(deps-dev): bump mongoose from 5.11.7 to 5.11.8
Bumps [mongoose](https://github.com/Automattic/mongoose) from 5.11.7 to 5.11.8.
- [Release notes](https://github.com/Automattic/mongoose/releases)
- [Changelog](https://github.com/Automattic/mongoose/blob/master/History.md)
- [Commits](https://github.com/Automattic/mongoose/compare/5.11.7...5.11.8)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-15 05:38:57 +00:00
dependabot[bot]
3102db3aa0 chore(deps-dev): bump clang-format from 1.4.0 to 1.5.0
Bumps [clang-format](https://github.com/angular/clang-format) from 1.4.0 to 1.5.0.
- [Release notes](https://github.com/angular/clang-format/releases)
- [Commits](https://github.com/angular/clang-format/commits)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-15 05:34:54 +00:00
Renovate Bot
452ffb4125 chore(deps): update typescript-eslint monorepo to v4.10.0 2020-12-14 18:24:07 +00:00
Renovate Bot
5a33605808 chore(deps): update dependency webpack to v5.10.1 2020-12-14 12:06:18 +00:00
Kamil Mysliwiec
51a5de1287 Merge pull request #5939 from nestjs/dependabot/npm_and_yarn/types/node-14.14.13
chore(deps-dev): bump @types/node from 14.14.12 to 14.14.13
2020-12-14 12:51:21 +01:00
Kamil Mysliwiec
a64116b2bf Merge pull request #5936 from nestjs/dependabot/npm_and_yarn/husky-4.3.6
chore(deps-dev): bump husky from 4.3.5 to 4.3.6
2020-12-14 12:28:55 +01:00
Kamil Mysliwiec
1426d4c860 Merge pull request #5940 from nestjs/dependabot/npm_and_yarn/sinon-9.2.2
chore(deps-dev): bump sinon from 9.2.1 to 9.2.2
2020-12-14 12:28:49 +01:00
dependabot[bot]
24113d9de9 chore(deps-dev): bump @types/node from 14.14.12 to 14.14.13
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 14.14.12 to 14.14.13.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-14 10:52:35 +00:00
Kamil Mysliwiec
33e2c65892 Merge pull request #5941 from nestjs/renovate/nestjs-swagger-4.x
fix(deps): update dependency @nestjs/swagger to v4.7.6
2020-12-14 11:50:19 +01:00
Kamil Mysliwiec
c21a1526ad Merge pull request #5937 from nestjs/dependabot/npm_and_yarn/types/mongoose-5.10.3
chore(deps-dev): bump @types/mongoose from 5.10.2 to 5.10.3
2020-12-14 11:50:03 +01:00
dependabot[bot]
f1d1ab8a03 chore(deps-dev): bump husky from 4.3.5 to 4.3.6
Bumps [husky](https://github.com/typicode/husky) from 4.3.5 to 4.3.6.
- [Release notes](https://github.com/typicode/husky/releases)
- [Commits](https://github.com/typicode/husky/compare/v4.3.5...v4.3.6)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-14 10:26:14 +00:00
dependabot[bot]
0d902753c9 chore(deps-dev): bump @types/mongoose from 5.10.2 to 5.10.3
Bumps [@types/mongoose](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/mongoose) from 5.10.2 to 5.10.3.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/mongoose)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-14 10:26:12 +00:00
dependabot[bot]
ef172c3e21 chore(deps-dev): bump sinon from 9.2.1 to 9.2.2
Bumps [sinon](https://github.com/sinonjs/sinon) from 9.2.1 to 9.2.2.
- [Release notes](https://github.com/sinonjs/sinon/releases)
- [Changelog](https://github.com/sinonjs/sinon/blob/master/CHANGELOG.md)
- [Commits](https://github.com/sinonjs/sinon/commits/v9.2.2)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-14 10:26:05 +00:00
Renovate Bot
ae6f24b9fe fix(deps): update dependency @nestjs/swagger to v4.7.6 2020-12-14 10:25:41 +00:00
Kamil Myśliwiec
8db37bdbbe Merge branch 'master' of https://github.com/nestjs/nest 2020-12-11 13:33:15 +01:00
Kamil Myśliwiec
0b21ccb216 test(core): fix injector unit tests 2020-12-11 13:33:02 +01:00
Kamil Mysliwiec
a02be476fc Merge pull request #5926 from nestjs/renovate/nestjs-bull-0.x
fix(deps): update dependency @nestjs/bull to v0.3.1
2020-12-11 13:22:20 +01:00
Renovate Bot
b2e964a611 fix(deps): update dependency @nestjs/bull to v0.3.1 2020-12-11 12:13:36 +00:00
Kamil Mysliwiec
594ea3a734 Merge pull request #5914 from nestjs/renovate/node-14.x
chore(deps): update dependency @types/node to v14.14.12
2020-12-11 13:11:35 +01:00
Kamil Myśliwiec
bcc3fd3881 fix(core): fix injection of transient providers to middleware #5427 2020-12-11 12:24:51 +01:00
Renovate Bot
0dd86eb730 chore(deps): update dependency @types/node to v14.14.12 2020-12-11 11:10:21 +00:00
Kamil Myśliwiec
3dbf5f81f2 refactor(core): a few tweaks to the injector class 2020-12-11 11:39:42 +01:00
Kamil Mysliwiec
495de37512 Merge pull request #5918 from nestjs/renovate/jest-26.x
chore(deps): update dependency @types/jest to v26.0.19
2020-12-11 11:32:10 +01:00
Kamil Myśliwiec
9628c2dd76 Merge branch 'master' into fix/5701-await-kafka-response 2020-12-11 11:19:37 +01:00
Kamil Myśliwiec
85e50eb8e7 test(microservices): mock send method to return a promise (kafka client) 2020-12-11 11:19:24 +01:00
Kamil Myśliwiec
44f039820d Merge branch 'master' into fix/5701-await-kafka-response 2020-12-11 11:03:44 +01:00
Kamil Myśliwiec
b0e37bb746 Merge branch 'master' of https://github.com/nestjs/nest 2020-12-11 11:03:31 +01:00
Kamil Myśliwiec
9d0c96e953 fix(microservices): catch async errors when trying to send a message (kafka) 2020-12-11 11:03:19 +01:00
Renovate Bot
a3366e11a5 chore(deps): update dependency @types/jest to v26.0.19 2020-12-11 09:28:20 +00:00
Kamil Mysliwiec
e06be8fb30 Merge pull request #5891 from nestjs/renovate/kafkajs-1.x
chore(deps): update dependency kafkajs to v1.15.0
2020-12-11 09:40:56 +01:00
Kamil Mysliwiec
5da76830fe Merge pull request #5898 from nestjs/dependabot/npm_and_yarn/typescript-eslint/parser-4.9.1
chore(deps-dev): bump @typescript-eslint/parser from 4.9.0 to 4.9.1
2020-12-11 09:40:46 +01:00
Kamil Mysliwiec
8409a18420 Merge pull request #5913 from nestjs/renovate/nest-monorepo
chore(deps): update nest monorepo
2020-12-11 09:40:25 +01:00
Kamil Mysliwiec
413ca1990e Merge pull request #5923 from nestjs/dependabot/npm_and_yarn/types/cors-2.8.9
chore(deps-dev): bump @types/cors from 2.8.8 to 2.8.9
2020-12-11 09:28:21 +01:00
Kamil Mysliwiec
b10c3cb30e Merge pull request #5924 from nestjs/renovate/ts-loader-8.x
chore(deps): update dependency ts-loader to v8.0.12
2020-12-11 09:28:08 +01:00
Renovate Bot
61ee778554 chore(deps): update nest monorepo 2020-12-11 08:21:56 +00:00
Kamil Mysliwiec
ceb7ff8628 Merge pull request #5921 from nestjs/dependabot/npm_and_yarn/types/node-14.14.12
chore(deps-dev): bump @types/node from 14.14.11 to 14.14.12
2020-12-11 09:07:48 +01:00
dependabot[bot]
192ae54de0 chore(deps-dev): bump @typescript-eslint/parser from 4.9.0 to 4.9.1
Bumps [@typescript-eslint/parser](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/parser) from 4.9.0 to 4.9.1.
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/master/packages/parser/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v4.9.1/packages/parser)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-11 07:59:28 +00:00
Kamil Mysliwiec
66a1231a0c Merge pull request #5919 from nestjs/renovate/mongoose-5.x
fix(deps): update dependency mongoose to v5.11.7
2020-12-11 08:56:43 +01:00
Renovate Bot
09f5af366a chore(deps): update dependency ts-loader to v8.0.12 2020-12-11 06:33:46 +00:00
dependabot[bot]
d8d71b0c15 chore(deps-dev): bump @types/cors from 2.8.8 to 2.8.9
Bumps [@types/cors](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/cors) from 2.8.8 to 2.8.9.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/cors)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-11 05:17:01 +00:00
dependabot[bot]
a8186d8160 chore(deps-dev): bump @types/node from 14.14.11 to 14.14.12
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 14.14.11 to 14.14.12.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-11 05:12:30 +00:00
Renovate Bot
8d3f139286 fix(deps): update dependency mongoose to v5.11.7 2020-12-11 00:42:31 +00:00
Kamil Myśliwiec
bbf091c77c chore(@nestjs) publish v7.6.1 release 2020-12-10 12:34:14 +01:00
Kamil Myśliwiec
192d22f5e0 fix(common): add validate path alias for add leading slash fn (regression) 2020-12-10 12:33:04 +01:00
Renovate Bot
2ec1281b17 chore(deps): update dependency kafkajs to v1.15.0 2020-12-10 10:39:32 +00:00
Kamil Myśliwiec
b8a9772d51 test(microservices): fix server kafka unit tests 2020-12-10 11:30:58 +01:00
Kamil Myśliwiec
021c32380b fix(microservices): await respond callback, return promise to await (kafka) #5701 2020-12-10 10:53:49 +01:00
Kamil Mysliwiec
72cd8ff732 Merge pull request #5906 from nestjs/renovate/mongoose-5.x
fix(deps): update dependency mongoose to v5.11.6
2020-12-10 10:29:05 +01:00
Renovate Bot
fdb93219ad fix(deps): update dependency mongoose to v5.11.6 2020-12-10 09:18:40 +00:00
Kamil Myśliwiec
a9451b64fc chore(@nestjs) publish v7.6.0 release 2020-12-10 10:10:11 +01:00
Kamil Mysliwiec
d36c938548 Merge pull request #5908 from nestjs/renovate/babel-monorepo
chore(deps): update babel monorepo to v7.12.10
2020-12-10 09:47:57 +01:00
Kamil Mysliwiec
63048baf99 Merge pull request #5910 from nestjs/dependabot/npm_and_yarn/mongoose-5.11.6
chore(deps-dev): bump mongoose from 5.11.5 to 5.11.6
2020-12-10 09:47:49 +01:00
Kamil Mysliwiec
a52001dfb9 Merge pull request #5911 from nestjs/dependabot/npm_and_yarn/fastify-cors-5.1.0
chore(deps): bump fastify-cors from 5.0.0 to 5.1.0
2020-12-10 09:47:37 +01:00
dependabot[bot]
bb38257d13 chore(deps): bump fastify-cors from 5.0.0 to 5.1.0
Bumps [fastify-cors](https://github.com/fastify/fastify-cors) from 5.0.0 to 5.1.0.
- [Release notes](https://github.com/fastify/fastify-cors/releases)
- [Commits](https://github.com/fastify/fastify-cors/compare/v5.0.0...v5.1.0)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-10 05:30:47 +00:00
dependabot[bot]
703afcb1ba chore(deps-dev): bump mongoose from 5.11.5 to 5.11.6
Bumps [mongoose](https://github.com/Automattic/mongoose) from 5.11.5 to 5.11.6.
- [Release notes](https://github.com/Automattic/mongoose/releases)
- [Changelog](https://github.com/Automattic/mongoose/blob/master/History.md)
- [Commits](https://github.com/Automattic/mongoose/compare/5.11.5...5.11.6)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-10 05:27:59 +00:00
Renovate Bot
80f5a9ce48 chore(deps): update babel monorepo to v7.12.10 2020-12-09 22:51:30 +00:00
Kamil Myśliwiec
5a23bc1749 Merge branch 'andreialecu-andreialecu-patch-1' 2020-12-09 14:37:40 +01:00
Kamil Myśliwiec
bb6b643645 fix(): temporarily revert #5790 (breaking changes) 2020-12-09 14:37:26 +01:00
Kamil Myśliwiec
710b385a32 chore(): merge master 2020-12-09 14:11:38 +01:00
Kamil Mysliwiec
e78f364ef4 Merge pull request #5890 from nestjs/renovate/engine.io-client-4.x
chore(deps): update dependency engine.io-client to v4.0.5
2020-12-09 14:01:19 +01:00
Kamil Mysliwiec
23ce8d03e4 Merge pull request #5892 from nestjs/renovate/node-14.x
chore(deps): update dependency @types/node to v14.14.11
2020-12-09 14:01:06 +01:00
Kamil Myśliwiec
4c4d8039b8 Merge branch 'master' of https://github.com/nestjs/nest 2020-12-09 13:56:00 +01:00
Kamil Myśliwiec
016c6b554b fix(microservices): remove duplicated imports 2020-12-09 13:55:48 +01:00
Renovate Bot
3f5aad635b chore(deps): update dependency engine.io-client to v4.0.5 2020-12-09 08:21:00 +00:00
Renovate Bot
1c500b4bfc chore(deps): update dependency @types/node to v14.14.11 2020-12-09 08:08:06 +00:00
Kamil Mysliwiec
cad68b6d46 Merge pull request #5894 from nestjs/renovate/jest-26.x
chore(deps): update dependency @types/jest to v26.0.18
2020-12-09 08:43:37 +01:00
Kamil Mysliwiec
a22abe53b5 Merge pull request #5886 from nestjs/renovate/pin-dependencies
fix(deps): pin dependencies
2020-12-09 08:34:20 +01:00
Kamil Mysliwiec
65afede473 Merge pull request #5896 from nestjs/dependabot/npm_and_yarn/types/mocha-8.2.0
chore(deps-dev): bump @types/mocha from 8.0.4 to 8.2.0
2020-12-09 08:33:33 +01:00
Kamil Mysliwiec
171b53c3fb Merge pull request #5897 from nestjs/dependabot/npm_and_yarn/types/node-14.14.11
chore(deps-dev): bump @types/node from 14.14.10 to 14.14.11
2020-12-09 08:33:26 +01:00
Kamil Mysliwiec
8875d431b4 Merge pull request #5900 from nestjs/dependabot/npm_and_yarn/nestjs/mongoose-7.2.0
chore(deps-dev): bump @nestjs/mongoose from 7.1.2 to 7.2.0
2020-12-09 08:33:02 +01:00
Kamil Mysliwiec
0d078a0188 Merge pull request #5901 from nestjs/dependabot/npm_and_yarn/uuid-8.3.2
chore(deps): bump uuid from 8.3.1 to 8.3.2
2020-12-09 08:32:54 +01:00
dependabot[bot]
4dd0a1344e chore(deps): bump uuid from 8.3.1 to 8.3.2
Bumps [uuid](https://github.com/uuidjs/uuid) from 8.3.1 to 8.3.2.
- [Release notes](https://github.com/uuidjs/uuid/releases)
- [Changelog](https://github.com/uuidjs/uuid/blob/master/CHANGELOG.md)
- [Commits](https://github.com/uuidjs/uuid/compare/v8.3.1...v8.3.2)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-09 05:44:32 +00:00
dependabot[bot]
585ead7f24 chore(deps-dev): bump @nestjs/mongoose from 7.1.2 to 7.2.0
Bumps [@nestjs/mongoose](https://github.com/nestjs/mongoose) from 7.1.2 to 7.2.0.
- [Release notes](https://github.com/nestjs/mongoose/releases)
- [Changelog](https://github.com/nestjs/mongoose/blob/master/.release-it.json)
- [Commits](https://github.com/nestjs/mongoose/compare/7.1.2...7.2.0)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-09 05:40:49 +00:00
dependabot[bot]
be64b51796 chore(deps-dev): bump @types/node from 14.14.10 to 14.14.11
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 14.14.10 to 14.14.11.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-09 05:32:12 +00:00
dependabot[bot]
55a2efd883 chore(deps-dev): bump @types/mocha from 8.0.4 to 8.2.0
Bumps [@types/mocha](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/mocha) from 8.0.4 to 8.2.0.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/mocha)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-09 05:28:58 +00:00
Renovate Bot
338be4c8d0 chore(deps): update dependency @types/jest to v26.0.18 2020-12-08 21:14:22 +00:00
Christian Allred
b0afb1a63a Merge branch 'master' into master 2020-12-08 13:50:14 -07:00
Christian Allred
59cd40dd56 Revert "fixing option settings in integration tests"
This reverts commit bd988169348670994f2eb3666d78a9a9a1ebfc57.
2020-12-08 13:47:18 -07:00
Christian Allred
685c685bd7 fixing option settings in integration tests 2020-12-08 13:47:13 -07:00
Christian Allred
be2384c4c7 small update 2020-12-08 13:47:02 -07:00
Christian Allred
e7ef9d9c70 Update to Queue Options for rmqlib 2020-12-08 13:47:02 -07:00
Christian Allred
fc5b440da2 updated the typings for rmq socket options to match the interface provided by connection manager 2020-12-08 13:47:02 -07:00
Christian Allred
998d3de9e6 made a dumb copy paste error here. 2020-12-08 13:47:01 -07:00
Christian Allred
e766276922 removing comment 2020-12-08 13:47:01 -07:00
Christian Allred
2abb426d99 added a link for reference 2020-12-08 13:47:01 -07:00
Christian Allred
4fc147f6e0 https://github.com/nestjs/nest/issues/5788 2020-12-08 13:47:01 -07:00
Renovate Bot
63337763a2 fix(deps): pin dependencies 2020-12-08 16:32:49 +00:00
Kamil Myśliwiec
ab6960da62 Merge branch 'master' of https://github.com/nestjs/nest 2020-12-08 17:21:20 +01:00
Kamil Myśliwiec
a0ded7ad68 test(e2e): fix e2e tests 2020-12-08 17:21:13 +01:00
Kamil Mysliwiec
15aca2bfef Merge pull request #5885 from nestjs/chore/update-sample-deps
chore(): update samples dependencies
2020-12-08 14:55:43 +01:00
Kamil Myśliwiec
9df5146082 sample(): downgrade socket.io-redis dep 2020-12-08 14:07:19 +01:00
Kamil Myśliwiec
c79685c88b Merge branch 'master' into chore/update-sample-deps 2020-12-08 14:05:31 +01:00
Kamil Myśliwiec
9c8cf3978a test(): fix promise type arguments (void) 2020-12-08 14:02:25 +01:00
Kamil Myśliwiec
8c9ee77fb8 Merge branch 'mkaufmaner-2984-kafka-reply-partitions' 2020-12-08 13:34:25 +01:00
Kamil Myśliwiec
fb3db3f6bf chore(): minor tweaks, align to the rest of the codebase 2020-12-08 13:34:10 +01:00
Kamil Myśliwiec
8c503d3193 chore(): update typescript to the latest version 2020-12-08 13:27:51 +01:00
Kamil Myśliwiec
b65f1be3f2 Merge branch 'vinayak25-master' 2020-12-08 13:07:20 +01:00
Kamil Myśliwiec
9a64742f61 refactor(core): inline logger condition 2020-12-08 13:07:01 +01:00
Kamil Myśliwiec
1e32627d17 scripts(): only generate package lock (update samples script) 2020-12-08 13:01:48 +01:00
Kamil Myśliwiec
e4243d55c2 Merge branch 'master' of https://github.com/vinayak25/nest into vinayak25-master 2020-12-08 12:59:36 +01:00
Kamil Myśliwiec
684c5f4af7 Merge branch 'master' of https://github.com/nestjs/nest 2020-12-08 12:34:56 +01:00
Kamil Myśliwiec
51e3559995 fix(): fix sse stream unit test 2020-12-08 12:34:51 +01:00
Kamil Myśliwiec
beb8fe36fc chore(): update samples dependencies 2020-12-08 12:31:55 +01:00
Kamil Mysliwiec
2a4dcc1964 Merge pull request #5863 from coder-freestyle/fix-specify-property-name-for-validate-nested
feat(common): specify parent property for validation of nested objects
2020-12-08 11:38:25 +01:00
Kamil Myśliwiec
e7150ba5a9 Merge branch 'miZyind-controller-path-alias' 2020-12-08 11:35:29 +01:00
Karan Gupta
a0a3fb96cf feat(common): specify parent property for validation of nested objects
Passed the complete path of the parent property as an argument. Earlier, the information regarding the parent was getting lost due to flattening of array in case of validation of nested objects.

Closes #5380
2020-12-05 16:13:45 +05:30
Vinayak Sarawagi
7ef474e74e REVERT isNil in applyLogger method 2020-11-30 14:07:13 +05:30
Vinayak Sarawagi
8dc4967721 FIX options.logger bug 2020-11-30 14:02:05 +05:30
Vinayak Sarawagi
419167ad0f fix #5816 2020-11-28 20:17:13 +05:30
Michael Kaufman
35add88251 feat(microservices): Tests for Kafka reply partitioner 2020-11-21 21:24:55 -05:00
Michael Kaufman
0f80068198 feat(microservices): Increase timeout for kafka tests 2020-11-21 20:33:37 -05:00
Michael Kaufman
32b19c7c37 feat(microservices): Uses kafkajs package for tests 2020-11-21 15:13:26 -05:00
Michael Kaufman
827f1e7f77 feat(microservices): Waits for kafka server to close 2020-11-21 14:46:39 -05:00
Michael Kaufman
404a916129 feat(microservices): Fixes dependencies 2020-11-21 14:38:49 -05:00
Michael Kaufman
1e1a52eb20 feat(microservices): Updates package lock 2020-11-21 14:29:46 -05:00
Michael Kaufman
c67b58f527 feat(microservices): Begins updating the reply partitioner test 2020-11-21 14:26:53 -05:00
Michael Kaufman
4763fb9552 feat(microservices): Removes unused exception and updates tests 2020-11-21 14:07:23 -05:00
Michael Kaufman
68ad994f2d feat(microservices): Updates external kafka interface 2020-11-21 13:40:17 -05:00
Michael Kaufman
0f15e35f2b feat(microservices): Removes redundant kafka external interface 2020-11-21 13:32:40 -05:00
Michael Kaufman
25ca219254 feat(microservices): Completed tests for concurrency 2020-11-21 12:50:25 -05:00
Michael Kaufman
5e21c9e51e feat(microservices): Only stores min partition 2020-11-04 09:40:33 -05:00
Michael Kaufman
b0a0623cf8 feat(microservices): Fixes assigner algo and uses min partition 2020-11-03 16:57:43 -05:00
Michael Kaufman
9449d612aa feat(microservices): Removes assignment store injects client 2020-11-03 09:12:36 -05:00
Michael Kaufman
15473e4251 feat(microservices): Fixes changes to kafka client 2020-11-02 18:49:55 -05:00
Michael Kaufman
1a46de0af3 feat(microservices): Fixes changes to KafkaClient 2020-11-02 18:19:45 -05:00
Michael Kaufman
453e4afd3e feat(microservices): Remove changes that were accidentally included 2020-11-02 18:16:59 -05:00
Michael Kaufman
ffd602ffd5 Merge remote-tracking branch 'Nest/master' into 2984-kafka-reply-partitions 2020-11-02 18:14:55 -05:00
Michael Kaufman
badae7b4e7 feat(microservices): Implement Kafka Nestjs reply partition assigner 2020-11-02 17:55:49 -05:00
Michael Kaufman
a85cb0fe4b feat(microservices): Kafka controller await client consumers connect 2020-10-31 17:39:00 -04:00
Andrei Alecu
cc9a912d0a fix(): add optional dependencies to package.json 2020-10-29 16:07:24 +02:00
122 changed files with 80932 additions and 212837 deletions

@@ -70,3 +70,4 @@ services:
       KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
       KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
       KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
+      KAFKA_DELETE_TOPIC_ENABLE: 'true'

@@ -26,7 +26,7 @@ class TestInjectable
 class AppModule {}

 async function bootstrap() {
-  const app = await NestFactory.create(AppModule, { logger: true });
+  const app = await NestFactory.create(AppModule, { logger: false });
   if (SIGNAL_TO_LISTEN && SIGNAL_TO_LISTEN !== 'NONE') {
     app.enableShutdownHooks([SIGNAL_TO_LISTEN]);

@@ -0,0 +1,291 @@
import { INestApplication, Logger } from '@nestjs/common';
import { Transport } from '@nestjs/microservices';
import { Test } from '@nestjs/testing';
import { Admin, ITopicMetadata, Kafka } from 'kafkajs';
import * as request from 'supertest';
import * as util from 'util';
import { KafkaConcurrentController } from '../src/kafka-concurrent/kafka-concurrent.controller';
import { KafkaConcurrentMessagesController } from '../src/kafka-concurrent/kafka-concurrent.messages.controller';
describe('Kafka concurrent', function () {
const numbersOfServers = 3;
const requestTopic = 'math.sum.sync.number.wait';
const responseTopic = 'math.sum.sync.number.wait.reply';
let admin: Admin;
const servers: any[] = [];
const apps: INestApplication[] = [];
const logger = new Logger('concurrent-kafka.spec.ts');
// set timeout to be longer (especially for the after hook)
this.timeout(30000);
const startServer = async () => {
const module = await Test.createTestingModule({
controllers: [
KafkaConcurrentController,
KafkaConcurrentMessagesController,
],
}).compile();
// use our own logger for a little
// Logger.overrideLogger(new Logger());
const app = module.createNestApplication();
const server = app.getHttpAdapter().getInstance();
app.connectMicroservice({
transport: Transport.KAFKA,
options: {
client: {
brokers: ['localhost:9092'],
},
run: {
partitionsConsumedConcurrently: numbersOfServers,
},
},
});
// enable these for clean shutdown
app.enableShutdownHooks();
// push to the collection
servers.push(server);
apps.push(app);
// await the start
await app.startAllMicroservicesAsync();
await app.init();
};
it(`Create kafka topics/partitions`, async () => {
const kafka = new Kafka({
clientId: 'concurrent-test-admin',
brokers: ['localhost:9092'],
});
admin = kafka.admin();
await admin.connect();
let topicMetadata: {
topics: ITopicMetadata[];
};
try {
topicMetadata = await admin.fetchTopicMetadata({
topics: [requestTopic, responseTopic],
});
} catch (e) {
// create with number of servers
try {
await admin.createTopics({
topics: [
{
topic: requestTopic,
numPartitions: numbersOfServers,
replicationFactor: 1,
},
{
topic: responseTopic,
numPartitions: numbersOfServers,
replicationFactor: 1,
},
],
});
} catch (e) {
logger.error(util.format('Create topics error: %o', e));
}
}
if (topicMetadata && topicMetadata.topics.length > 0) {
// we have topics, how many partitions do they have?
for (const topic of topicMetadata.topics) {
if (topic.partitions.length < numbersOfServers) {
try {
await admin.createPartitions({
topicPartitions: [
{
topic: topic.name,
count: numbersOfServers,
},
],
});
} catch (e) {
logger.error(util.format('Create partitions error: %o', e));
}
}
}
}
// create with number of servers
try {
await admin.createTopics({
topics: [
{
topic: requestTopic,
numPartitions: numbersOfServers,
replicationFactor: 1,
},
{
topic: responseTopic,
numPartitions: numbersOfServers,
replicationFactor: 1,
},
],
});
} catch (e) {
logger.error(util.format('Create topics error: %o', e));
}
// disconnect
await admin.disconnect();
});
it(`Start Kafka apps`, async () => {
// start all at once
await Promise.all(
Array(numbersOfServers)
.fill(1)
.map(async (v, i) => {
// return startServer();
// wait in intervals so the consumers start in order
return new Promise<void>(resolve => {
setTimeout(async () => {
await startServer();
return resolve();
}, 1000 * i);
});
}),
);
}).timeout(30000);
it(`Concurrent messages without forcing a rebalance.`, async () => {
// wait a second before notifying the servers to respond
setTimeout(async () => {
// notify the other servers that it is time to respond
await Promise.all(
servers.map(async server => {
// send to all servers since indexes don't necessarily align with server consumers
return request(server).post('/go').send();
}),
);
}, 1000);
await Promise.all(
servers.map(async (server, index) => {
// send requests
const payload = {
key: index,
numbers: [1, index],
};
const result = (1 + index).toString();
return request(server)
.post('/mathSumSyncNumberWait')
.send(payload)
.expect(200)
.expect(200, result);
}),
);
});
it(`Close kafka client consumer while waiting for message pattern response.`, async () => {
await Promise.all(
servers.map(async (server, index) => {
// shut off and delete the leader
if (index === 0) {
return new Promise<void>(resolve => {
// wait a second before closing so the producers can send the message to the server consumers
setTimeout(async () => {
// get the controller
const controller = apps[index].get(KafkaConcurrentController);
// close the controller clients
await controller.client.close();
// notify the other servers that we have stopped
await Promise.all(
servers.map(async server => {
// send to all servers since indexes don't necessarily align with server consumers
return request(server).post('/go').send();
}),
);
return resolve();
}, 1000);
});
}
// send requests
const payload = {
key: index,
numbers: [1, index],
};
const result = (1 + index).toString();
return request(server)
.post('/mathSumSyncNumberWait')
.send(payload)
.expect(200)
.expect(200, result);
}),
);
});
it(`Start kafka client consumer while waiting for message pattern response.`, async () => {
await Promise.all(
servers.map(async (server, index) => {
// reconnect the first client's consumer
if (index === 0) {
return new Promise<void>(resolve => {
// wait a second before connecting so the producers can send the message to the server consumers
setTimeout(async () => {
// get the controller
const controller = apps[index].get(KafkaConcurrentController);
// connect the controller client
await controller.client.connect();
// notify the servers that we have started
await Promise.all(
servers.map(async server => {
// send to all servers since indexes don't necessarily align with server consumers
return request(server).post('/go').send();
}),
);
return resolve();
}, 1000);
});
}
// send requests
const payload = {
key: index,
numbers: [1, index],
};
const result = (1 + index).toString();
return request(server)
.post('/mathSumSyncNumberWait')
.send(payload)
.expect(200)
.expect(200, result);
}),
);
});
after(`Stopping Kafka app`, async () => {
// close all concurrently
return Promise.all(
apps.map(async app => {
return app.close();
}),
);
});
});


@@ -9,10 +9,13 @@ import { UserEntity } from '../src/kafka/entities/user.entity';
import { KafkaController } from '../src/kafka/kafka.controller';
import { KafkaMessagesController } from '../src/kafka/kafka.messages.controller';
describe('Kafka transport', () => {
describe('Kafka transport', function () {
let server;
let app: INestApplication;
// set timeout to be longer (especially for the after hook)
this.timeout(30000);
it(`Start Kafka app`, async () => {
const module = await Test.createTestingModule({
controllers: [KafkaController, KafkaMessagesController],
@@ -29,6 +32,7 @@ describe('Kafka transport', () => {
},
},
});
app.enableShutdownHooks();
await app.startAllMicroservicesAsync();
await app.init();
}).timeout(30000);


@@ -0,0 +1,4 @@
export class SumDto {
key: string;
numbers: number[];
}


@@ -0,0 +1,70 @@
import {
Body,
Controller,
HttpCode,
OnModuleDestroy,
OnModuleInit,
Post,
} from '@nestjs/common';
import { Logger } from '@nestjs/common/services/logger.service';
import { Client, ClientKafka, Transport } from '@nestjs/microservices';
import { PartitionerArgs } from 'kafkajs';
import { Observable } from 'rxjs';
import { SumDto } from './dto/sum.dto';
/**
* The following function explicitly sends messages to the partition specified by the `toPartition` header.
*/
const explicitPartitioner = () => {
return ({ message }: PartitionerArgs) => {
return parseFloat(message.headers.toPartition.toString());
};
};
@Controller()
export class KafkaConcurrentController
implements OnModuleInit, OnModuleDestroy {
protected readonly logger = new Logger(KafkaConcurrentController.name);
@Client({
transport: Transport.KAFKA,
options: {
client: {
brokers: ['localhost:9092'],
},
run: {
partitionsConsumedConcurrently: 3,
},
producer: {
createPartitioner: explicitPartitioner,
},
},
})
public readonly client: ClientKafka;
async onModuleInit() {
const requestPatterns = ['math.sum.sync.number.wait'];
requestPatterns.forEach(pattern => {
this.client.subscribeToResponseOf(pattern);
});
await this.client.connect();
}
async onModuleDestroy() {
await this.client.close();
}
@Post('mathSumSyncNumberWait')
@HttpCode(200)
public mathSumSyncNumberWait(@Body() data: SumDto): Observable<string> {
return this.client.send('math.sum.sync.number.wait', {
headers: {
toPartition: data.key.toString(),
},
key: data.key.toString(),
value: data.numbers,
});
}
}


@@ -0,0 +1,37 @@
import { Controller, HttpCode, Post } from '@nestjs/common';
import { MessagePattern } from '@nestjs/microservices';
import { BehaviorSubject, Observable } from 'rxjs';
import { first, map, skipWhile } from 'rxjs/operators';
@Controller()
export class KafkaConcurrentMessagesController {
public waiting = new BehaviorSubject<boolean>(false);
@Post('go')
@HttpCode(200)
async go() {
// no longer waiting
this.waiting.next(false);
return;
}
@MessagePattern('math.sum.sync.number.wait')
public mathSumSyncNumberWait(data: any): Observable<number> {
// start waiting
this.waiting.next(true);
// find sum
const sum = data.value[0] + data.value[1];
return this.waiting.asObservable().pipe(
skipWhile(isWaiting => {
return isWaiting;
}),
map(() => {
return sum;
}),
first(),
);
}
}
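The controller above parks each Kafka reply behind the `waiting` BehaviorSubject until the `/go` endpoint flips it back to `false`. A dependency-free sketch of the same gate, using a plain Promise in place of the rxjs `skipWhile`/`first` pipeline (class and method names are illustrative, not from the Nest source):

```typescript
// Promise-based stand-in for the BehaviorSubject gate above: handlers await
// `released`, and go() resolves it so every parked reply completes at once.
class ResponseGate {
  private release!: () => void;
  private released = new Promise<void>(resolve => (this.release = resolve));

  // Equivalent of the '/go' endpoint: stop waiting.
  public go(): void {
    this.release();
  }

  // Equivalent of mathSumSyncNumberWait: compute the sum now, reply after release.
  public async sum(numbers: [number, number]): Promise<number> {
    const result = numbers[0] + numbers[1];
    await this.released;
    return result;
  }
}
```

As in the rxjs version, the computation happens immediately; only the response is deferred, which is what lets the e2e tests hold many in-flight messages open at once.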


@@ -1,4 +1,4 @@
import { Body, Controller, HttpCode, OnModuleInit, Post } from '@nestjs/common';
import { Body, Controller, HttpCode, OnModuleInit, Post, OnModuleDestroy } from '@nestjs/common';
import { Logger } from '@nestjs/common/services/logger.service';
import { Client, ClientKafka, Transport } from '@nestjs/microservices';
import { Observable } from 'rxjs';
@@ -6,7 +6,7 @@ import { BusinessDto } from './dtos/business.dto';
import { UserDto } from './dtos/user.dto';
@Controller()
export class KafkaController implements OnModuleInit {
export class KafkaController implements OnModuleInit, OnModuleDestroy {
protected readonly logger = new Logger(KafkaController.name);
static IS_NOTIFIED = false;
static MATH_SUM = 0;
@@ -21,7 +21,7 @@ export class KafkaController implements OnModuleInit {
})
private readonly client: ClientKafka;
onModuleInit() {
async onModuleInit() {
const requestPatterns = [
'math.sum.sync.kafka.message',
'math.sum.sync.without.key',
@@ -36,6 +36,12 @@ export class KafkaController implements OnModuleInit {
requestPatterns.forEach(pattern => {
this.client.subscribeToResponseOf(pattern);
});
await this.client.connect();
}
async onModuleDestroy() {
await this.client.close();
}
// sync send kafka message


@@ -20,7 +20,7 @@ describe('ErrorGateway', () => {
ws.emit('push', {
test: 'test',
});
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('exception', data => {
expect(data).to.be.eql({
status: 'error',


@@ -20,7 +20,7 @@ describe('WebSocketGateway (ack)', () => {
await app.listenAsync(3000);
ws = io.connect('http://localhost:8080');
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.emit('push', { test: 'test' }, data => {
expect(data).to.be.eql('pong');
resolve();
@@ -33,7 +33,7 @@ describe('WebSocketGateway (ack)', () => {
await app.listenAsync(3000);
ws = io.connect('http://localhost:8080');
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.emit('push', data => {
expect(data).to.be.eql('pong');
resolve();


@@ -25,7 +25,7 @@ describe('WebSocketGateway', () => {
ws.emit('push', {
test: 'test',
});
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('pop', data => {
expect(data.test).to.be.eql('test');
resolve();
@@ -41,7 +41,7 @@ describe('WebSocketGateway', () => {
ws.emit('push', {
test: 'test',
});
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('pop', data => {
expect(data.test).to.be.eql('test');
resolve();
@@ -58,7 +58,7 @@ describe('WebSocketGateway', () => {
ws.emit('push', {
test: 'test',
});
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('pop', data => {
expect(data.test).to.be.eql('test');
resolve();


@@ -34,7 +34,7 @@ describe('WebSocketGateway (WsAdapter)', () => {
},
}),
);
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('message', data => {
expect(JSON.parse(data).data.test).to.be.eql('test');
resolve();
@@ -57,7 +57,7 @@ describe('WebSocketGateway (WsAdapter)', () => {
},
}),
);
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('message', data => {
expect(JSON.parse(data).data.test).to.be.eql('test');
resolve();
@@ -77,7 +77,7 @@ describe('WebSocketGateway (WsAdapter)', () => {
ws = new WebSocket('ws://localhost:8080');
ws2 = new WebSocket('ws://localhost:8090');
await new Promise(resolve =>
await new Promise<void>(resolve =>
ws.on('open', () => {
ws.on('message', data => {
expect(JSON.parse(data).data.test).to.be.eql('test');
@@ -94,7 +94,7 @@ describe('WebSocketGateway (WsAdapter)', () => {
}),
);
await new Promise(resolve => {
await new Promise<void>(resolve => {
ws2.on('message', data => {
expect(JSON.parse(data).data.test).to.be.eql('test');
resolve();


@@ -3,5 +3,5 @@
"packages": [
"packages/*"
],
"version": "7.5.5"
"version": "7.6.3"
}

34734 package-lock.json (generated) — file diff suppressed because it is too large.


@@ -66,8 +66,8 @@
"reflect-metadata": "0.1.13",
"rxjs": "6.6.3",
"socket.io": "2.3.0",
"uuid": "8.3.1",
"tslib": "2.0.3"
"tslib": "2.0.3",
"uuid": "8.3.2"
},
"devDependencies": {
"@codechecks/client": "0.1.10",
@@ -75,26 +75,26 @@
"@commitlint/config-angular": "11.0.0",
"@grpc/proto-loader": "0.5.5",
"@nestjs/graphql": "7.9.1",
"@nestjs/mongoose": "7.1.2",
"@nestjs/mongoose": "7.2.0",
"@nestjs/typeorm": "7.1.5",
"@types/amqplib": "0.5.16",
"@types/bytes": "3.1.0",
"@types/cache-manager": "2.10.3",
"@types/chai": "4.2.14",
"@types/chai-as-promised": "7.1.3",
"@types/cors": "2.8.8",
"@types/cors": "2.8.9",
"@types/express": "4.17.9",
"@types/gulp": "4.0.7",
"@types/mocha": "8.0.4",
"@types/mongoose": "5.10.2",
"@types/node": "14.14.10",
"@types/mocha": "8.2.0",
"@types/mongoose": "5.10.3",
"@types/node": "14.14.14",
"@types/redis": "2.8.28",
"@types/reflect-metadata": "0.1.0",
"@types/sinon": "9.0.9",
"@types/socket.io": "2.1.12",
"@types/ws": "7.4.0",
"@typescript-eslint/eslint-plugin": "4.9.1",
"@typescript-eslint/parser": "4.9.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"amqp-connection-manager": "3.2.1",
"amqplib": "0.6.0",
"apollo-server-express": "2.19.0",
@@ -105,21 +105,21 @@
"cache-manager": "3.4.0",
"chai": "4.2.0",
"chai-as-promised": "7.1.1",
"clang-format": "1.4.0",
"clang-format": "1.5.0",
"commitlint-circle": "1.0.0",
"concurrently": "5.3.0",
"conventional-changelog": "3.1.24",
"core-js": "3.8.1",
"coveralls": "3.1.0",
"delete-empty": "3.0.0",
"engine.io-client": "4.0.4",
"engine.io-client": "4.0.5",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"eventsource": "1.0.7",
"fancy-log": "1.3.3",
"fastify": "3.9.1",
"fastify-cors": "5.0.0",
"fastify": "3.9.2",
"fastify-cors": "5.1.0",
"fastify-formbody": "5.0.0",
"fastify-multipart": "3.3.1",
"fastify-static": "3.3.0",
@@ -132,18 +132,18 @@
"gulp-sourcemaps": "3.0.0",
"gulp-typescript": "5.0.1",
"gulp-watch": "5.0.1",
"husky": "4.3.5",
"husky": "4.3.6",
"imports-loader": "1.2.0",
"json-loader": "0.5.7",
"kafkajs": "1.12.0",
"kafkajs": "1.15.0",
"lerna": "2.11.0",
"light-my-request": "4.3.0",
"light-my-request": "4.4.1",
"lint-staged": "10.5.3",
"markdown-table": "2.0.0",
"merge-graphql-schemas": "1.7.8",
"middie": "5.2.0",
"mocha": "8.2.1",
"mongoose": "5.11.5",
"mongoose": "5.11.8",
"mqtt": "4.2.6",
"multer": "1.4.2",
"mysql": "2.18.1",
@@ -154,7 +154,7 @@
"prettier": "2.2.1",
"redis": "3.0.2",
"rxjs-compat": "6.6.3",
"sinon": "9.2.1",
"sinon": "9.2.2",
"sinon-chai": "3.5.0",
"socket.io-client": "2.3.1",
"subscriptions-transport-ws": "0.9.18",
@@ -162,7 +162,7 @@
"ts-morph": "9.1.0",
"ts-node": "9.1.1",
"typeorm": "0.2.29",
"typescript": "4.0.3",
"typescript": "4.1.3",
"wrk": "1.2.1",
"ws": "7.4.1"
},


@@ -1,3 +1,3 @@
export interface Type<T> extends Function {
export interface Type<T = any> extends Function {
new (...args: any[]): T;
}


@@ -1,6 +1,6 @@
{
"name": "@nestjs/common",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@common)",
"author": "Kamil Mysliwiec",
"homepage": "https://nestjs.com",
@@ -20,10 +20,24 @@
"axios": "0.21.0",
"iterare": "1.2.1",
"tslib": "2.0.3",
"uuid": "8.3.1"
"uuid": "8.3.2"
},
"peerDependencies": {
"cache-manager": "*",
"class-transformer": "*",
"class-validator": "*",
"reflect-metadata": "^0.1.12",
"rxjs": "^6.0.0"
},
"peerDependenciesMeta": {
"cache-manager": {
"optional": true
},
"class-validator": {
"optional": true
},
"class-transformer": {
"optional": true
}
}
}


@@ -199,27 +199,35 @@ export class ValidationPipe implements PipeTransform<any> {
protected mapChildrenToValidationErrors(
error: ValidationError,
parentPath?: string,
): ValidationError[] {
if (!(error.children && error.children.length)) {
return [error];
}
const validationErrors = [];
parentPath = parentPath
? `${parentPath}.${error.property}`
: error.property;
for (const item of error.children) {
if (item.children && item.children.length) {
validationErrors.push(...this.mapChildrenToValidationErrors(item));
validationErrors.push(
...this.mapChildrenToValidationErrors(item, parentPath),
);
}
validationErrors.push(this.prependConstraintsWithParentProp(error, item));
validationErrors.push(
this.prependConstraintsWithParentProp(parentPath, item),
);
}
return validationErrors;
}
protected prependConstraintsWithParentProp(
parentError: ValidationError,
parentPath: string,
error: ValidationError,
): ValidationError {
const constraints = {};
for (const key in error.constraints) {
constraints[key] = `${parentError.property}.${error.constraints[key]}`;
constraints[key] = `${parentPath}.${error.constraints[key]}`;
}
return {
...error,
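The `ValidationPipe` diff above threads the accumulated `parentPath` through the recursion, so errors nested more than one level deep report their full property path. A standalone sketch of the fixed traversal, with a minimal `ValidationError` shape assumed for illustration (simplified from class-validator):

```typescript
// Minimal shape assumed for illustration (simplified from class-validator).
interface ValidationError {
  property: string;
  constraints?: { [type: string]: string };
  children?: ValidationError[];
}

// Sketch of the fixed recursion: the accumulated parent path is passed down
// and prepended to each leaf constraint, so a nested array element reports
// e.g. "test.0.prop1 must be a string" rather than losing the "test" prefix.
function flattenValidationErrors(
  error: ValidationError,
  parentPath?: string,
): ValidationError[] {
  if (!(error.children && error.children.length)) {
    return [error];
  }
  const validationErrors: ValidationError[] = [];
  parentPath = parentPath ? `${parentPath}.${error.property}` : error.property;
  for (const item of error.children) {
    if (item.children && item.children.length) {
      // pass parentPath down (this argument was missing before the fix)
      validationErrors.push(...flattenValidationErrors(item, parentPath));
    }
    const constraints: { [type: string]: string } = {};
    for (const key in item.constraints || {}) {
      constraints[key] = `${parentPath}.${(item.constraints as any)[key]}`;
    }
    validationErrors.push({ ...item, constraints });
  }
  return validationErrors;
}
```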


@@ -152,7 +152,7 @@ export class Logger implements LoggerService {
const pidMessage = color(`[Nest] ${process.pid} - `);
const contextMessage = context ? yellow(`[${context}] `) : '';
const timestampDiff = this.updateAndGetTimestampDiff(isTimeDiffEnabled);
const instance = this.instance as typeof Logger;
const instance = (this.instance as typeof Logger) ?? Logger;
process.stdout.write(
`${pidMessage}${instance.getTimestamp()} ${contextMessage}${output}${timestampDiff}\n`,
);


@@ -8,6 +8,7 @@ import {
IsOptional,
IsString,
ValidateNested,
IsArray,
} from 'class-validator';
import { HttpStatus } from '../../enums';
import { UnprocessableEntityException } from '../../exceptions';
@@ -156,6 +157,32 @@ describe('ValidationPipe', () => {
]);
}
});
class TestModelForNestedArrayValidation {
@IsString()
public prop: string;
@IsArray()
@ValidateNested()
@Type(() => TestModel2)
public test: TestModel2[];
}
it('should provide complete path for nested errors', async () => {
try {
const model = new TestModelForNestedArrayValidation();
model.test = [new TestModel2()];
await target.transform(model, {
type: 'body',
metatype: TestModelForNestedArrayValidation,
});
} catch (err) {
expect(err.getResponse().message).to.be.eql([
'prop must be a string',
'test.0.prop1 must be a string',
'test.0.prop2 must be a boolean value',
]);
}
});
});
describe('when validation transforms', () => {
it('should return a TestModel instance', async () => {


@@ -27,6 +27,12 @@ export const isPlainObject = (fn: any): fn is object => {
export const addLeadingSlash = (path?: string): string =>
path ? (path.charAt(0) !== '/' ? '/' + path : path) : '';
/**
* Deprecated. Use the "addLeadingSlash" function instead.
* @deprecated
*/
export const validatePath = addLeadingSlash;
export const isFunction = (fn: any): boolean => typeof fn === 'function';
export const isString = (fn: any): fn is string => typeof fn === 'string';
export const isConstructor = (fn: any): boolean => fn === 'constructor';


@@ -69,82 +69,6 @@ export interface InjectorDependencyContext {
}
export class Injector {
public async loadMiddleware(
wrapper: InstanceWrapper,
collection: Map<string, InstanceWrapper>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const { metatype } = wrapper;
const targetWrapper = collection.get(metatype.name);
if (!isUndefined(targetWrapper.instance)) {
return;
}
const loadInstance = (instances: any[]) => {
targetWrapper.instance = targetWrapper.isDependencyTreeStatic()
? new (metatype as Type<any>)(...instances)
: Object.create(metatype.prototype);
};
await this.resolveConstructorParams(
wrapper,
moduleRef,
null,
loadInstance,
contextId,
inquirer,
);
}
public async loadController(
wrapper: InstanceWrapper<Controller>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
) {
const controllers = moduleRef.controllers;
await this.loadInstance<Controller>(
wrapper,
controllers,
moduleRef,
contextId,
wrapper,
);
await this.loadEnhancersPerContext(wrapper, contextId, wrapper);
}
public async loadInjectable<T = any>(
wrapper: InstanceWrapper<T>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const injectables = moduleRef.injectables;
await this.loadInstance<T>(
wrapper,
injectables,
moduleRef,
contextId,
inquirer,
);
}
public async loadProvider(
wrapper: InstanceWrapper<Injectable>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const providers = moduleRef.providers;
await this.loadInstance<Injectable>(
wrapper,
providers,
moduleRef,
contextId,
inquirer,
);
await this.loadEnhancersPerContext(wrapper, contextId, wrapper);
}
public loadPrototype<T>(
{ name }: InstanceWrapper<T>,
collection: Map<string, InstanceWrapper<T>>,
@@ -164,15 +88,6 @@ export class Injector {
}
}
public applyDoneHook<T>(wrapper: InstancePerContext<T>): () => void {
let done: () => void;
wrapper.donePromise = new Promise<void>((resolve, reject) => {
done = resolve;
});
wrapper.isPending = true;
return done;
}
public async loadInstance<T>(
wrapper: InstanceWrapper<T>,
collection: Map<string, InstanceWrapper>,
@@ -225,11 +140,91 @@ export class Injector {
);
}
public async loadMiddleware(
wrapper: InstanceWrapper,
collection: Map<string, InstanceWrapper>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const { metatype } = wrapper;
const targetWrapper = collection.get(metatype.name);
if (!isUndefined(targetWrapper.instance)) {
return;
}
targetWrapper.instance = Object.create(metatype.prototype);
await this.loadInstance(
wrapper,
collection,
moduleRef,
contextId,
inquirer || wrapper,
);
}
public async loadController(
wrapper: InstanceWrapper<Controller>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
) {
const controllers = moduleRef.controllers;
await this.loadInstance<Controller>(
wrapper,
controllers,
moduleRef,
contextId,
wrapper,
);
await this.loadEnhancersPerContext(wrapper, contextId, wrapper);
}
public async loadInjectable<T = any>(
wrapper: InstanceWrapper<T>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const injectables = moduleRef.injectables;
await this.loadInstance<T>(
wrapper,
injectables,
moduleRef,
contextId,
inquirer,
);
}
public async loadProvider(
wrapper: InstanceWrapper<Injectable>,
moduleRef: Module,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
) {
const providers = moduleRef.providers;
await this.loadInstance<Injectable>(
wrapper,
providers,
moduleRef,
contextId,
inquirer,
);
await this.loadEnhancersPerContext(wrapper, contextId, wrapper);
}
public applyDoneHook<T>(wrapper: InstancePerContext<T>): () => void {
let done: () => void;
wrapper.donePromise = new Promise<void>((resolve, reject) => {
done = resolve;
});
wrapper.isPending = true;
return done;
}
public async resolveConstructorParams<T>(
wrapper: InstanceWrapper<T>,
moduleRef: Module,
inject: InjectorDependency[],
callback: (args: unknown[]) => void,
callback: (args: unknown[]) => void | Promise<void>,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
parentInquirer?: InstanceWrapper,
@@ -274,7 +269,7 @@ export class Injector {
if (!instanceHost.isResolved && !paramWrapper.forwardRef) {
isResolved = false;
}
return instanceHost && instanceHost.instance;
return instanceHost?.instance;
} catch (err) {
const isOptional = optionalDependenciesIds.includes(index);
if (!isOptional) {
@@ -372,7 +367,7 @@ export class Injector {
public async resolveComponentHost<T>(
moduleRef: Module,
instanceWrapper: InstanceWrapper<T>,
instanceWrapper: InstanceWrapper<T | Promise<T>>,
contextId = STATIC_CONTEXT,
inquirer?: InstanceWrapper,
): Promise<InstanceWrapper> {


@@ -47,11 +47,15 @@ export class ModuleTokenFactory {
private replacer(key: string, value: any) {
if (typeof value === 'function') {
const isClass = /^class\s/.test(Function.prototype.toString.call(value));
const funcAsString = value.toString();
const isClass = /^class\s/.test(funcAsString);
if (isClass) {
return value.name;
}
return hash(value.toString(), { ignoreUnknown: true });
return hash(funcAsString, { ignoreUnknown: true });
}
if (typeof value === 'symbol') {
return value.toString();
}
return value;
}
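The `ModuleTokenFactory` change above adds a `symbol` branch to the `JSON.stringify` replacer, since symbol values are otherwise dropped during serialization and dynamic modules keyed by symbol providers would collide. A self-contained sketch of the replacer logic (the real implementation hashes the function source rather than returning it verbatim):

```typescript
// Sketch of the replacer from the diff above: classes serialize to their name,
// plain functions to their source text (hashed in the real implementation),
// and symbols to their string form so JSON.stringify no longer drops them.
function replacer(key: string, value: any): any {
  if (typeof value === 'function') {
    const funcAsString = value.toString();
    const isClass = /^class\s/.test(funcAsString);
    if (isClass) {
      return value.name;
    }
    return funcAsString; // stand-in for hash(funcAsString, { ignoreUnknown: true })
  }
  if (typeof value === 'symbol') {
    return value.toString();
  }
  return value;
}

// Without the symbol branch, JSON.stringify would emit undefined for the
// symbol value; with it, the token stays stable: {"provide":"Symbol(a)",...}
const token = JSON.stringify({ provide: Symbol('a'), useValue: 'a' }, replacer);
```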


@@ -168,6 +168,9 @@ export class MiddlewareModule {
if (isUndefined(instanceWrapper)) {
throw new RuntimeException();
}
if (instanceWrapper.isTransient) {
return;
}
await this.bindHandler(
instanceWrapper,
applicationRef,


@@ -348,7 +348,7 @@ export class NestApplication
}
private listenToPromise(microservice: INestMicroservice) {
return new Promise(async resolve => {
return new Promise<void>(async resolve => {
await microservice.listen(resolve);
});
}


@@ -215,10 +215,10 @@ export class NestFactoryStatic {
}
private applyLogger(options: NestApplicationContextOptions | undefined) {
if (!options) {
if (!options || options?.logger === true || isNil(options?.logger)) {
return;
}
!isNil(options.logger) && Logger.overrideLogger(options.logger);
Logger.overrideLogger(options.logger);
}
private createHttpAdapter<T = any>(httpServer?: T): AbstractHttpAdapter {


@@ -1,6 +1,6 @@
{
"name": "@nestjs/core",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@core)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -33,14 +33,28 @@
"object-hash": "2.0.3",
"path-to-regexp": "3.2.0",
"tslib": "2.0.3",
"uuid": "8.3.1"
"uuid": "8.3.2"
},
"devDependencies": {
"@nestjs/common": "7.5.5"
"@nestjs/common": "7.6.3"
},
"peerDependencies": {
"@nestjs/common": "^7.0.0",
"@nestjs/microservices": "^7.0.0",
"@nestjs/platform-express": "^7.0.0",
"@nestjs/websockets": "^7.0.0",
"reflect-metadata": "^0.1.12",
"rxjs": "^6.0.0"
},
"peerDependenciesMeta": {
"@nestjs/websockets": {
"optional": true
},
"@nestjs/microservices": {
"optional": true
},
"@nestjs/platform-express": {
"optional": true
}
}
}


@@ -189,28 +189,28 @@ describe('Injector', () => {
});
describe('loadMiddleware', () => {
let resolveConstructorParams: sinon.SinonSpy;
let loadInstanceSpy: sinon.SinonSpy;
beforeEach(() => {
resolveConstructorParams = sinon.spy();
injector.resolveConstructorParams = resolveConstructorParams;
loadInstanceSpy = sinon.spy();
injector.loadInstance = loadInstanceSpy;
});
it('should call "resolveConstructorParams" when instance is not resolved', () => {
it('should call "loadInstance" when instance is not resolved', () => {
const collection = {
get: (...args) => ({}),
set: (...args) => {},
};
injector.loadMiddleware(
{ metatype: { name: '' } } as any,
{ metatype: { name: '', prototype: {} } } as any,
collection as any,
null,
);
expect(resolveConstructorParams.called).to.be.true;
expect(loadInstanceSpy.called).to.be.true;
});
it('should not call "resolveConstructorParams" when instance is not resolved', () => {
it('should not call "loadInstanceSpy" when instance is not resolved', () => {
const collection = {
get: (...args) => ({
instance: {},
@@ -223,7 +223,7 @@ describe('Injector', () => {
collection as any,
null,
);
expect(resolveConstructorParams.called).to.be.false;
expect(loadInstanceSpy.called).to.be.false;
});
});


@@ -30,6 +30,7 @@ describe('ModuleTokenFactory', () => {
const token = factory.create(type, {
providers: [{}],
} as any);
expect(token).to.be.deep.eq(
hash({
id: moduleId,
@@ -62,6 +63,24 @@ describe('ModuleTokenFactory', () => {
'{"providers":["Provider"],"exports":["Provider"]}',
);
});
it('should serialize symbols in a dynamic metadata object', () => {
const metadata = {
providers: [
{
provide: Symbol('a'),
useValue: 'a',
},
{
provide: Symbol('b'),
useValue: 'b',
},
],
};
expect(factory.getDynamicMetadataToken(metadata)).to.be.eql(
'{"providers":[{"provide":"Symbol(a)","useValue":"a"},{"provide":"Symbol(b)","useValue":"b"}]}',
);
});
});
describe('when metadata does not exist', () => {
it('should return empty string', () => {


@@ -125,7 +125,7 @@ data: hello
'Content-Type': 'text/event-stream',
Connection: 'keep-alive',
'Cache-Control':
'private, no-cache, no-store, must-revalidate, max-age=0',
'private, no-cache, no-store, must-revalidate, max-age=0, no-transform',
'Transfer-Encoding': 'identity',
Pragma: 'no-cache',
Expire: '0',


@@ -8,7 +8,6 @@ import {
} from '../constants';
import { KafkaResponseDeserializer } from '../deserializers/kafka-response.deserializer';
import { KafkaHeaders } from '../enums';
import { InvalidKafkaClientTopicPartitionException } from '../errors/invalid-kafka-client-topic-partition.exception';
import { InvalidKafkaClientTopicException } from '../errors/invalid-kafka-client-topic.exception';
import {
BrokersFunction,
@@ -24,7 +23,7 @@ import {
import {
KafkaLogger,
KafkaParser,
KafkaRoundRobinPartitionAssigner,
KafkaReplyPartitionAssigner,
} from '../helpers';
import {
KafkaOptions,
@@ -46,7 +45,7 @@ export class ClientKafka extends ClientProxy {
protected producer: Producer = null;
protected logger = new Logger(ClientKafka.name);
protected responsePatterns: string[] = [];
protected consumerAssignments: { [key: string]: number[] } = {};
protected consumerAssignments: { [key: string]: number } = {};
protected brokers: string[] | BrokersFunction;
protected clientId: string;
@@ -59,17 +58,16 @@ export class ClientKafka extends ClientProxy {
this.getOptionsProp(this.options, 'client') || ({} as KafkaConfig);
const consumerOptions =
this.getOptionsProp(this.options, 'consumer') || ({} as ConsumerConfig);
const postfixId =
this.getOptionsProp(this.options, 'postfixId') || '-client';
this.brokers = clientOptions.brokers || [KAFKA_DEFAULT_BROKER];
// Append a unique id to the clientId and groupId
// so they don't collide with a microservices client
this.clientId =
(clientOptions.clientId || KAFKA_DEFAULT_CLIENT) +
(clientOptions.clientIdPostfix || '-client');
this.groupId =
(consumerOptions.groupId || KAFKA_DEFAULT_GROUP) +
(clientOptions.clientIdPostfix || '-client');
(clientOptions.clientId || KAFKA_DEFAULT_CLIENT) + postfixId;
this.groupId = (consumerOptions.groupId || KAFKA_DEFAULT_GROUP) + postfixId;
kafkaPackage = loadPackage('kafkajs', ClientKafka.name, () =>
require('kafkajs'),
@@ -99,11 +97,8 @@ export class ClientKafka extends ClientProxy {
this.client = this.createClient();
const partitionAssigners = [
(
config: ConstructorParameters<
typeof KafkaRoundRobinPartitionAssigner
>[0],
) => new KafkaRoundRobinPartitionAssigner(config),
(config: ConstructorParameters<typeof KafkaReplyPartitionAssigner>[1]) =>
new KafkaReplyPartitionAssigner(this, config),
] as any[];
const consumerOptions = Object.assign(
@@ -188,6 +183,10 @@ export class ClientKafka extends ClientProxy {
};
}
public getConsumerAssignments() {
return this.consumerAssignments;
}
protected dispatchEvent(packet: OutgoingEvent): Promise<any> {
const pattern = this.normalizePattern(packet.pattern);
const outgoingEvent = this.serializer.serialize(packet.data);
@@ -202,17 +201,13 @@ export class ClientKafka extends ClientProxy {
}
protected getReplyTopicPartition(topic: string): string {
const topicAssignments = this.consumerAssignments[topic];
if (isUndefined(topicAssignments)) {
const minimumPartition = this.consumerAssignments[topic];
if (isUndefined(minimumPartition)) {
throw new InvalidKafkaClientTopicException(topic);
}
// if the current member isn't listening to
// any partitions on the topic then throw an error.
if (isUndefined(topicAssignments[0])) {
throw new InvalidKafkaClientTopicPartitionException(topic);
}
return topicAssignments[0].toString();
// get the minimum partition
return minimumPartition.toString();
}
protected publish(
@@ -241,7 +236,7 @@ export class ClientKafka extends ClientProxy {
},
this.options.send || {},
);
this.producer.send(message);
this.producer.send(message).catch(err => callback({ err }));
return () => this.routingMap.delete(packet.id);
} catch (err) {
@@ -254,7 +249,18 @@ export class ClientKafka extends ClientProxy {
}
protected setConsumerAssignments(data: ConsumerGroupJoinEvent): void {
this.consumerAssignments = data.payload.memberAssignment;
const consumerAssignments: { [key: string]: number } = {};
// only need to set the minimum
Object.keys(data.payload.memberAssignment).forEach(memberId => {
const minimumPartition = Math.min(
...data.payload.memberAssignment[memberId],
);
consumerAssignments[memberId] = minimumPartition;
});
this.consumerAssignments = consumerAssignments;
}
protected initializeSerializer(options: KafkaOptions['options']) {
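The `setConsumerAssignments` rewrite above collapses each assignment's partition list down to its lowest partition, which is what `getReplyTopicPartition` now returns. A standalone sketch of that reduction (the `memberAssignment` shape of topic key to partition array is assumed from kafkajs's `ConsumerGroupJoinEvent` payload):

```typescript
// Reduce each key's assigned partition list to its minimum partition,
// mirroring the setConsumerAssignments change above (sketch, not the Nest source).
function toMinimumAssignments(memberAssignment: {
  [key: string]: number[];
}): { [key: string]: number } {
  const consumerAssignments: { [key: string]: number } = {};
  Object.keys(memberAssignment).forEach(key => {
    // only the minimum partition is needed for reply routing
    consumerAssignments[key] = Math.min(...memberAssignment[key]);
  });
  return consumerAssignments;
}
```

Storing a single number per key is also why the `consumerAssignments` type changed from `{ [key: string]: number[] }` to `{ [key: string]: number }` earlier in the diff.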


@@ -157,7 +157,7 @@ export class ClientMqtt extends ClientProxy {
const pattern = this.normalizePattern(packet.pattern);
const serializedPacket = this.serializer.serialize(packet);
return new Promise((resolve, reject) =>
return new Promise<void>((resolve, reject) =>
this.mqttClient.publish(pattern, JSON.stringify(serializedPacket), err =>
err ? reject(err) : resolve(),
),


@@ -111,7 +111,7 @@ export class ClientNats extends ClientProxy {
const pattern = this.normalizePattern(packet.pattern);
const serializedPacket = this.serializer.serialize(packet);
return new Promise((resolve, reject) =>
return new Promise<void>((resolve, reject) =>
this.natsClient.publish(pattern, serializedPacket as any, err =>
err ? reject(err) : resolve(),
),


@@ -190,7 +190,7 @@ export class ClientRedis extends ClientProxy {
const pattern = this.normalizePattern(packet.pattern);
const serializedPacket = this.serializer.serialize(packet);
return new Promise((resolve, reject) =>
return new Promise<void>((resolve, reject) =>
this.pubClient.publish(pattern, JSON.stringify(serializedPacket), err =>
err ? reject(err) : resolve(),
),


@@ -16,9 +16,9 @@ import {
RQM_DEFAULT_QUEUE_OPTIONS,
RQM_DEFAULT_URL,
} from '../constants';
import { RmqUrl } from '../external/rmq-url.interface';
import { ReadPacket, RmqOptions, WritePacket } from '../interfaces';
import { ClientProxy } from './client-proxy';
import { RmqUrl } from '../external/rmq-url.interface';
let rqmPackage: any = {};
@@ -204,14 +204,14 @@ export class ClientRMQ extends ClientProxy {
protected dispatchEvent(packet: ReadPacket): Promise<any> {
const serializedPacket = this.serializer.serialize(packet);
return new Promise((resolve, reject) =>
return new Promise<void>((resolve, reject) =>
this.channel.sendToQueue(
this.queue,
Buffer.from(JSON.stringify(serializedPacket)),
{
persistent: this.persistent,
},
err => (err ? reject(err) : resolve()),
(err: unknown) => (err ? reject(err) : resolve()),
),
);
}


@@ -1,9 +0,0 @@
import { RuntimeException } from '@nestjs/core/errors/exceptions/runtime.exception';
export class InvalidKafkaClientTopicPartitionException extends RuntimeException {
constructor(topic?: string) {
super(
`The client consumer subscribed to the topic (${topic}) which is not assigned to any partitions.`,
);
}
}


@@ -1,931 +0,0 @@
/// <reference types="node" />
import * as net from 'net';
import * as tls from 'tls';
type Without<T, U> = { [P in Exclude<keyof T, keyof U>]?: never };
type XOR<T, U> = T | U extends object
? (Without<T, U> & U) | (Without<U, T> & T)
: T | U;
export declare class Kafka {
constructor(config: KafkaConfig);
producer(config?: ProducerConfig): Producer;
consumer(config?: ConsumerConfig): Consumer;
admin(config?: AdminConfig): Admin;
logger(): Logger;
}
export type BrokersFunction = () => string[] | Promise<string[]>;
export interface KafkaConfig {
brokers: string[] | BrokersFunction;
ssl?: tls.ConnectionOptions | boolean;
sasl?: SASLOptions;
clientId?: string;
clientIdPostfix?: string;
connectionTimeout?: number;
authenticationTimeout?: number;
reauthenticationThreshold?: number;
requestTimeout?: number;
enforceRequestTimeout?: boolean;
retry?: RetryOptions;
socketFactory?: ISocketFactory;
logLevel?: logLevel;
logCreator?: logCreator;
}
export type ISocketFactory = (
host: string,
port: number,
ssl: tls.ConnectionOptions,
onConnect: () => void,
) => net.Socket;
export type SASLMechanism = 'plain' | 'scram-sha-256' | 'scram-sha-512' | 'aws';
export interface SASLOptions {
mechanism: SASLMechanism;
username: string;
password: string;
}
export interface ProducerConfig {
createPartitioner?: ICustomPartitioner;
retry?: RetryOptions;
metadataMaxAge?: number;
allowAutoTopicCreation?: boolean;
idempotent?: boolean;
transactionalId?: string;
transactionTimeout?: number;
maxInFlightRequests?: number;
}
export interface Message {
key?: Buffer | string | null;
value: Buffer | string | null;
partition?: number;
headers?: IHeaders;
timestamp?: string;
}
export interface PartitionerArgs {
topic: string;
partitionMetadata: PartitionMetadata[];
message: Message;
}
export type ICustomPartitioner = () => (args: PartitionerArgs) => number;
export type DefaultPartitioner = ICustomPartitioner;
export type JavaCompatiblePartitioner = ICustomPartitioner;
export let Partitioners: {
DefaultPartitioner: DefaultPartitioner;
JavaCompatiblePartitioner: JavaCompatiblePartitioner;
};
export type PartitionMetadata = {
partitionErrorCode: number;
partitionId: number;
leader: number;
replicas: number[];
isr: number[];
offlineReplicas?: number[];
};
export interface IHeaders {
[key: string]: Buffer | string;
}
export interface ConsumerConfig {
groupId: string;
partitionAssigners?: PartitionAssigner[];
metadataMaxAge?: number;
sessionTimeout?: number;
rebalanceTimeout?: number;
heartbeatInterval?: number;
maxBytesPerPartition?: number;
minBytes?: number;
maxBytes?: number;
maxWaitTimeInMs?: number;
retry?: RetryOptions & {
restartOnFailure?: (err: Error) => Promise<boolean>;
};
allowAutoTopicCreation?: boolean;
maxInFlightRequests?: number;
readUncommitted?: boolean;
rackId?: string;
}
export type PartitionAssigner = (config: { cluster: Cluster }) => Assigner;
export interface CoordinatorMetadata {
errorCode: number;
coordinator: {
nodeId: number;
host: string;
port: number;
};
}
export type Cluster = {
isConnected(): boolean;
connect(): Promise<void>;
disconnect(): Promise<void>;
refreshMetadata(): Promise<void>;
refreshMetadataIfNecessary(): Promise<void>;
addTargetTopic(topic: string): Promise<void>;
findBroker(node: { nodeId: string }): Promise<Broker>;
findControllerBroker(): Promise<Broker>;
findTopicPartitionMetadata(topic: string): PartitionMetadata[];
findLeaderForPartitions(
topic: string,
partitions: number[],
): { [leader: string]: number[] };
findGroupCoordinator(group: { groupId: string }): Promise<Broker>;
findGroupCoordinatorMetadata(group: {
groupId: string;
}): Promise<CoordinatorMetadata>;
defaultOffset(config: { fromBeginning: boolean }): number;
fetchTopicsOffset(
topics: Array<
{
topic: string;
partitions: Array<{ partition: number }>;
} & XOR<{ fromBeginning: boolean }, { fromTimestamp: number }>
>,
): Promise<{
topic: string;
partitions: Array<{ partition: number; offset: string }>;
}>;
};
export type Assignment = { [topic: string]: number[] };
export type GroupMember = { memberId: string; memberMetadata: Buffer };
export type GroupMemberAssignment = {
memberId: string;
memberAssignment: Buffer;
};
export type GroupState = { name: string; metadata: Buffer };
export type Assigner = {
name: string;
version: number;
assign(group: {
members: GroupMember[];
topics: string[];
}): Promise<GroupMemberAssignment[]>;
protocol(subscription: { topics: string[] }): GroupState;
};
export interface RetryOptions {
maxRetryTime?: number;
initialRetryTime?: number;
factor?: number;
multiplier?: number;
retries?: number;
}
export interface AdminConfig {
retry?: RetryOptions;
}
export interface ITopicConfig {
topic: string;
numPartitions?: number;
replicationFactor?: number;
replicaAssignment?: object[];
configEntries?: object[];
}
export interface ITopicPartitionConfig {
topic: string;
count: number;
assignments?: Array<Array<number>>;
}
export interface ITopicMetadata {
name: string;
partitions: PartitionMetadata[];
}
export enum ResourceTypes {
UNKNOWN = 0,
ANY = 1,
TOPIC = 2,
GROUP = 3,
CLUSTER = 4,
TRANSACTIONAL_ID = 5,
DELEGATION_TOKEN = 6,
}
export interface ResourceConfigQuery {
type: ResourceTypes;
name: string;
configNames?: string[];
}
export interface ConfigEntries {
configName: string;
configValue: string;
isDefault: boolean;
isSensitive: boolean;
readOnly: boolean;
configSynonyms: ConfigSynonyms[];
}
export interface ConfigSynonyms {
configName: string;
configValue: string;
configSource: number;
}
export interface DescribeConfigResponse {
resources: {
configEntries: ConfigEntries[];
errorCode: number;
errorMessage: string;
resourceName: string;
resourceType: ResourceTypes;
}[];
throttleTime: number;
}
export interface IResourceConfig {
type: ResourceTypes;
name: string;
configEntries: { name: string; value: string }[];
}
type ValueOf<T> = T[keyof T];
export type AdminEvents = {
CONNECT: 'admin.connect';
DISCONNECT: 'admin.disconnect';
REQUEST: 'admin.network.request';
REQUEST_TIMEOUT: 'admin.network.request_timeout';
REQUEST_QUEUE_SIZE: 'admin.network.request_queue_size';
};
export interface InstrumentationEvent<T> {
id: string;
type: string;
timestamp: number;
payload: T;
}
export type RemoveInstrumentationEventListener<T> = () => void;
export type ConnectEvent = InstrumentationEvent<null>;
export type DisconnectEvent = InstrumentationEvent<null>;
export type RequestEvent = InstrumentationEvent<{
apiKey: number;
apiName: string;
apiVersion: number;
broker: string;
clientId: string;
correlationId: number;
createdAt: number;
duration: number;
pendingDuration: number;
sentAt: number;
size: number;
}>;
export type RequestTimeoutEvent = InstrumentationEvent<{
apiKey: number;
apiName: string;
apiVersion: number;
broker: string;
clientId: string;
correlationId: number;
createdAt: number;
pendingDuration: number;
sentAt: number;
}>;
export type RequestQueueSizeEvent = InstrumentationEvent<{
broker: string;
clientId: string;
queueSize: number;
}>;
export interface SeekEntry {
partition: number;
offset: string;
}
export type Admin = {
connect(): Promise<void>;
disconnect(): Promise<void>;
listTopics(): Promise<string[]>;
createTopics(options: {
validateOnly?: boolean;
waitForLeaders?: boolean;
timeout?: number;
topics: ITopicConfig[];
}): Promise<boolean>;
deleteTopics(options: { topics: string[]; timeout?: number }): Promise<void>;
createPartitions(options: {
validateOnly?: boolean;
timeout?: number;
topicPartitions: ITopicPartitionConfig[];
}): Promise<boolean>;
fetchTopicMetadata(options?: {
topics: string[];
}): Promise<{ topics: Array<ITopicMetadata> }>;
fetchOffsets(options: {
groupId: string;
topic: string;
}): Promise<Array<SeekEntry & { metadata: string | null }>>;
fetchTopicOffsets(
topic: string,
): Promise<Array<SeekEntry & { high: string; low: string }>>;
fetchTopicOffsetsByTimestamp(
topic: string,
timestamp?: number,
): Promise<Array<SeekEntry>>;
describeCluster(): Promise<{
brokers: Array<{ nodeId: number; host: string; port: number }>;
controller: number | null;
clusterId: string;
}>;
setOffsets(options: {
groupId: string;
topic: string;
partitions: SeekEntry[];
}): Promise<void>;
resetOffsets(options: {
groupId: string;
topic: string;
earliest: boolean;
}): Promise<void>;
describeConfigs(configs: {
resources: ResourceConfigQuery[];
includeSynonyms: boolean;
}): Promise<DescribeConfigResponse>;
alterConfigs(configs: {
validateOnly: boolean;
resources: IResourceConfig[];
}): Promise<any>;
listGroups(): Promise<{ groups: GroupOverview[] }>;
deleteGroups(groupIds: string[]): Promise<DeleteGroupsResult[]>;
describeGroups(groupIds: string[]): Promise<GroupDescriptions>;
logger(): Logger;
on(
eventName: ValueOf<AdminEvents>,
listener: (...args: any[]) => void,
): RemoveInstrumentationEventListener<typeof eventName>;
events: AdminEvents;
};
export let PartitionAssigners: { roundRobin: PartitionAssigner };
export interface ISerializer<T> {
encode(value: T): Buffer;
decode(buffer: Buffer): T | null;
}
export type MemberMetadata = {
version: number;
topics: string[];
userData: Buffer;
};
export type MemberAssignment = {
version: number;
assignment: Assignment;
userData: Buffer;
};
export let AssignerProtocol: {
MemberMetadata: ISerializer<MemberMetadata>;
MemberAssignment: ISerializer<MemberAssignment>;
};
export enum logLevel {
NOTHING = 0,
ERROR = 1,
WARN = 2,
INFO = 4,
DEBUG = 5,
}
export interface LogEntry {
namespace: string;
level: logLevel;
label: string;
log: LoggerEntryContent;
}
export interface LoggerEntryContent {
readonly timestamp: Date;
readonly message: string;
[key: string]: any;
}
export type logCreator = (logLevel: logLevel) => (entry: LogEntry) => void;
export type Logger = {
info: (message: string, extra?: object) => void;
error: (message: string, extra?: object) => void;
warn: (message: string, extra?: object) => void;
debug: (message: string, extra?: object) => void;
};
export type Broker = {
isConnected(): boolean;
connect(): Promise<void>;
disconnect(): Promise<void>;
apiVersions(): Promise<{
[apiKey: number]: { minVersion: number; maxVersion: number };
}>;
metadata(
topics: string[],
): Promise<{
brokers: Array<{
nodeId: number;
host: string;
port: number;
rack?: string;
}>;
topicMetadata: Array<{
topicErrorCode: number;
topic: number;
partitionMetadata: PartitionMetadata[];
}>;
}>;
offsetCommit(request: {
groupId: string;
groupGenerationId: number;
memberId: string;
retentionTime?: number;
topics: Array<{
topic: string;
partitions: Array<{ partition: number; offset: string }>;
}>;
}): Promise<any>;
fetch(request: {
replicaId?: number;
isolationLevel?: number;
maxWaitTime?: number;
minBytes?: number;
maxBytes?: number;
topics: Array<{
topic: string;
partitions: Array<{
partition: number;
fetchOffset: string;
maxBytes: number;
}>;
}>;
rackId?: string;
}): Promise<any>;
};
export type KafkaMessage = {
key: Buffer;
value: Buffer | null;
timestamp: string;
size: number;
attributes: number;
offset: string;
headers?: IHeaders;
};
export interface ProducerRecord {
topic: string;
messages: Message[];
acks?: number;
timeout?: number;
compression?: CompressionTypes;
}
export type RecordMetadata = {
topicName: string;
partition: number;
errorCode: number;
offset: string;
timestamp: string;
};
export interface TopicMessages {
topic: string;
messages: Message[];
}
export interface ProducerBatch {
acks?: number;
timeout?: number;
compression?: CompressionTypes;
topicMessages?: TopicMessages[];
}
export interface PartitionOffset {
partition: number;
offset: string;
}
export interface TopicOffsets {
topic: string;
partitions: PartitionOffset[];
}
export interface Offsets {
topics: TopicOffsets[];
}
type Sender = {
send(record: ProducerRecord): Promise<RecordMetadata[]>;
sendBatch(batch: ProducerBatch): Promise<RecordMetadata[]>;
};
export type ProducerEvents = {
CONNECT: 'producer.connect';
DISCONNECT: 'producer.disconnect';
REQUEST: 'producer.network.request';
REQUEST_TIMEOUT: 'producer.network.request_timeout';
REQUEST_QUEUE_SIZE: 'producer.network.request_queue_size';
};
export type Producer = Sender & {
connect(): Promise<void>;
disconnect(): Promise<void>;
isIdempotent(): boolean;
events: ProducerEvents;
on(
eventName: ValueOf<ProducerEvents>,
listener: (...args: any[]) => void,
): RemoveInstrumentationEventListener<typeof eventName>;
transaction(): Promise<Transaction>;
logger(): Logger;
};
export type Transaction = Sender & {
sendOffsets(offsets: Offsets & { consumerGroupId: string }): Promise<void>;
commit(): Promise<void>;
abort(): Promise<void>;
isActive(): boolean;
};
export type ConsumerGroup = {
groupId: string;
generationId: number;
memberId: string;
coordinator: Broker;
};
export type MemberDescription = {
clientHost: string;
clientId: string;
memberId: string;
memberAssignment: Buffer;
memberMetadata: Buffer;
};
export type GroupDescription = {
groupId: string;
members: MemberDescription[];
protocol: string;
protocolType: string;
state: string;
};
export type GroupDescriptions = {
groups: GroupDescription[];
};
export type TopicPartitions = { topic: string; partitions: number[] };
export type TopicPartitionOffsetAndMetadata = {
topic: string;
partition: number;
offset: string;
metadata?: string | null;
};
// TODO: Remove with 2.x
export type TopicPartitionOffsetAndMedata = TopicPartitionOffsetAndMetadata;
export type Batch = {
topic: string;
partition: number;
highWatermark: string;
messages: KafkaMessage[];
isEmpty(): boolean;
firstOffset(): string | null;
lastOffset(): string;
offsetLag(): string;
offsetLagLow(): string;
};
export type GroupOverview = {
groupId: string;
protocolType: string;
};
export type DeleteGroupsResult = {
groupId: string;
errorCode?: number;
};
export type ConsumerEvents = {
HEARTBEAT: 'consumer.heartbeat';
COMMIT_OFFSETS: 'consumer.commit_offsets';
GROUP_JOIN: 'consumer.group_join';
FETCH_START: 'consumer.fetch_start';
FETCH: 'consumer.fetch';
START_BATCH_PROCESS: 'consumer.start_batch_process';
END_BATCH_PROCESS: 'consumer.end_batch_process';
CONNECT: 'consumer.connect';
DISCONNECT: 'consumer.disconnect';
STOP: 'consumer.stop';
CRASH: 'consumer.crash';
REQUEST: 'consumer.network.request';
REQUEST_TIMEOUT: 'consumer.network.request_timeout';
REQUEST_QUEUE_SIZE: 'consumer.network.request_queue_size';
};
export type ConsumerHeartbeatEvent = InstrumentationEvent<{
groupId: string;
memberId: string;
groupGenerationId: number;
}>;
export type ConsumerCommitOffsetsEvent = InstrumentationEvent<{
groupId: string;
memberId: string;
groupGenerationId: number;
topics: {
topic: string;
partitions: {
offset: string;
partition: string;
}[];
}[];
}>;
export interface IMemberAssignment {
[key: string]: number[];
}
export type ConsumerGroupJoinEvent = InstrumentationEvent<{
duration: number;
groupId: string;
isLeader: boolean;
leaderId: string;
groupProtocol: string;
memberId: string;
memberAssignment: IMemberAssignment;
}>;
export type ConsumerFetchEvent = InstrumentationEvent<{
numberOfBatches: number;
duration: number;
}>;
interface IBatchProcessEvent {
topic: string;
partition: number;
highWatermark: string;
offsetLag: string;
offsetLagLow: string;
batchSize: number;
firstOffset: string;
lastOffset: string;
}
export type ConsumerStartBatchProcessEvent = InstrumentationEvent<
IBatchProcessEvent
>;
export type ConsumerEndBatchProcessEvent = InstrumentationEvent<
IBatchProcessEvent & { duration: number }
>;
export type ConsumerCrashEvent = InstrumentationEvent<{
error: Error;
groupId: string;
}>;
export interface OffsetsByTopicPartition {
topics: TopicOffsets[];
}
export interface EachMessagePayload {
topic: string;
partition: number;
message: KafkaMessage;
}
export interface EachBatchPayload {
batch: Batch;
resolveOffset(offset: string): void;
heartbeat(): Promise<void>;
commitOffsetsIfNecessary(offsets?: Offsets): Promise<void>;
uncommittedOffsets(): OffsetsByTopicPartition;
isRunning(): boolean;
isStale(): boolean;
}
/**
* Type alias to keep compatibility with @types/kafkajs
* @see https://github.com/DefinitelyTyped/DefinitelyTyped/blob/712ad9d59ccca6a3cc92f347fea0d1c7b02f5eeb/types/kafkajs/index.d.ts#L321-L325
*/
export type ConsumerEachMessagePayload = EachMessagePayload;
/**
* Type alias to keep compatibility with @types/kafkajs
* @see https://github.com/DefinitelyTyped/DefinitelyTyped/blob/712ad9d59ccca6a3cc92f347fea0d1c7b02f5eeb/types/kafkajs/index.d.ts#L327-L336
*/
export type ConsumerEachBatchPayload = EachBatchPayload;
export type ConsumerRunConfig = {
autoCommit?: boolean;
autoCommitInterval?: number | null;
autoCommitThreshold?: number | null;
eachBatchAutoResolve?: boolean;
partitionsConsumedConcurrently?: number;
eachBatch?: (payload: EachBatchPayload) => Promise<void>;
eachMessage?: (payload: EachMessagePayload) => Promise<void>;
};
export type ConsumerSubscribeTopic = {
topic: string | RegExp;
fromBeginning?: boolean;
};
export type Consumer = {
connect(): Promise<void>;
disconnect(): Promise<void>;
subscribe(topic: ConsumerSubscribeTopic): Promise<void>;
stop(): Promise<void>;
run(config?: ConsumerRunConfig): Promise<void>;
commitOffsets(
topicPartitions: Array<TopicPartitionOffsetAndMetadata>,
): Promise<void>;
seek(topicPartition: {
topic: string;
partition: number;
offset: string;
}): void;
describeGroup(): Promise<GroupDescription>;
pause(topics: Array<{ topic: string; partitions?: number[] }>): void;
paused(): TopicPartitions[];
resume(topics: Array<{ topic: string; partitions?: number[] }>): void;
on(
eventName: ValueOf<ConsumerEvents>,
listener: (...args: any[]) => void,
): RemoveInstrumentationEventListener<typeof eventName>;
logger(): Logger;
events: ConsumerEvents;
};
export enum CompressionTypes {
None = 0,
GZIP = 1,
Snappy = 2,
LZ4 = 3,
ZSTD = 4,
}
export let CompressionCodecs: {
[CompressionTypes.GZIP]: () => any;
[CompressionTypes.Snappy]: () => any;
[CompressionTypes.LZ4]: () => any;
[CompressionTypes.ZSTD]: () => any;
};
export declare class KafkaJSError extends Error {
constructor(e: Error | string, metadata?: KafkaJSErrorMetadata);
}
export declare class KafkaJSNonRetriableError extends KafkaJSError {
constructor(e: Error | string);
}
export declare class KafkaJSProtocolError extends KafkaJSError {
constructor(e: Error | string);
}
export declare class KafkaJSOffsetOutOfRange extends KafkaJSProtocolError {
constructor(e: Error | string, metadata?: KafkaJSOffsetOutOfRangeMetadata);
}
export declare class KafkaJSNumberOfRetriesExceeded extends KafkaJSNonRetriableError {
constructor(
e: Error | string,
metadata?: KafkaJSNumberOfRetriesExceededMetadata,
);
}
export declare class KafkaJSConnectionError extends KafkaJSError {
constructor(e: Error | string, metadata?: KafkaJSConnectionErrorMetadata);
}
export declare class KafkaJSRequestTimeoutError extends KafkaJSError {
constructor(e: Error | string, metadata?: KafkaJSRequestTimeoutErrorMetadata);
}
export declare class KafkaJSMetadataNotLoaded extends KafkaJSError {
constructor();
}
export declare class KafkaJSTopicMetadataNotLoaded extends KafkaJSMetadataNotLoaded {
constructor(
e: Error | string,
metadata?: KafkaJSTopicMetadataNotLoadedMetadata,
);
}
export declare class KafkaJSStaleTopicMetadataAssignment extends KafkaJSError {
constructor(
e: Error | string,
metadata?: KafkaJSStaleTopicMetadataAssignmentMetadata,
);
}
export declare class KafkaJSServerDoesNotSupportApiKey extends KafkaJSNonRetriableError {
constructor(
e: Error | string,
metadata?: KafkaJSServerDoesNotSupportApiKeyMetadata,
);
}
export declare class KafkaJSBrokerNotFound extends KafkaJSError {
constructor();
}
export declare class KafkaJSPartialMessageError extends KafkaJSError {
constructor();
}
export declare class KafkaJSSASLAuthenticationError extends KafkaJSError {
constructor();
}
export declare class KafkaJSGroupCoordinatorNotFound extends KafkaJSError {
constructor();
}
export declare class KafkaJSNotImplemented extends KafkaJSError {
constructor();
}
export declare class KafkaJSTimeout extends KafkaJSError {
constructor();
}
export declare class KafkaJSLockTimeout extends KafkaJSError {
constructor();
}
export declare class KafkaJSUnsupportedMagicByteInMessageSet extends KafkaJSError {
constructor();
}
export declare class KafkaJSDeleteGroupsError extends KafkaJSError {
constructor(e: Error | string, groups?: KafkaJSDeleteGroupsErrorGroups[]);
}
export interface KafkaJSDeleteGroupsErrorGroups {
groupId: string;
errorCode: number;
error: KafkaJSError;
}
export interface KafkaJSErrorMetadata {
retriable?: boolean;
topic?: string;
partitionId?: number;
metadata?: PartitionMetadata;
}
export interface KafkaJSOffsetOutOfRangeMetadata {
topic: string;
partition: number;
}
export interface KafkaJSNumberOfRetriesExceededMetadata {
retryCount: number;
retryTime: number;
}
export interface KafkaJSConnectionErrorMetadata {
broker?: string;
code?: string;
}
export interface KafkaJSRequestTimeoutErrorMetadata {
broker: string;
clientId: string;
correlationId: number;
createdAt: number;
sentAt: number;
pendingDuration: number;
}
export interface KafkaJSTopicMetadataNotLoadedMetadata {
topic: string;
}
export interface KafkaJSStaleTopicMetadataAssignmentMetadata {
topic: string;
unknownPartitions: PartitionMetadata[];
}
export interface KafkaJSServerDoesNotSupportApiKeyMetadata {
apiKey: number;
apiName: string;
}

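The `XOR<T, U>` helper in these typings (used by `Cluster.fetchTopicsOffset`) makes two option shapes mutually exclusive at the type level. A short illustration, with a hypothetical `OffsetQuery` alias, showing how it admits `fromBeginning` or `fromTimestamp` but not both:

```typescript
// Helper types as declared in the kafkajs typings above.
type Without<T, U> = { [P in Exclude<keyof T, keyof U>]?: never };
type XOR<T, U> = T | U extends object
  ? (Without<T, U> & U) | (Without<U, T> & T)
  : T | U;

// Hypothetical alias mirroring the fetchTopicsOffset element shape.
type OffsetQuery = { topic: string } & XOR<
  { fromBeginning: boolean },
  { fromTimestamp: number }
>;

const byBeginning: OffsetQuery = { topic: 'events', fromBeginning: true };
const byTimestamp: OffsetQuery = { topic: 'events', fromTimestamp: Date.now() };
// Supplying both fields would fail to type-check, because Without<T, U>
// marks the other variant's keys as `?: never`:
// const invalid: OffsetQuery = { topic: 'events', fromBeginning: true, fromTimestamp: 0 };
console.log(byBeginning.topic);
```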

@@ -1,9 +1,19 @@
/**
* @see https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/types/kafkajs/index.d.ts
* Do NOT add NestJS logic to this interface. It is meant to ONLY represent the types for the kafkajs package.
*
* @see https://github.com/tulios/kafkajs/blob/master/types/index.d.ts
*/
/// <reference types="node" />
import * as net from 'net';
import * as tls from 'tls';
type Without<T, U> = { [P in Exclude<keyof T, keyof U>]?: never };
type XOR<T, U> = T | U extends object
? (Without<T, U> & U) | (Without<U, T> & T)
: T | U;
export declare class Kafka {
constructor(config: KafkaConfig);
producer(config?: ProducerConfig): Producer;
@@ -19,7 +29,6 @@ export interface KafkaConfig {
ssl?: tls.ConnectionOptions | boolean;
sasl?: SASLOptions;
clientId?: string;
clientIdPostfix?: string;
connectionTimeout?: number;
authenticationTimeout?: number;
reauthenticationThreshold?: number;
@@ -31,19 +40,40 @@ export interface KafkaConfig {
logCreator?: logCreator;
}
export type ISocketFactory = (
host: string,
port: number,
ssl: tls.ConnectionOptions,
onConnect: () => void,
) => net.Socket;
export interface SASLOptions {
mechanism: 'plain' | 'scram-sha-256' | 'scram-sha-512' | 'aws';
username: string;
password: string;
export interface ISocketFactoryArgs {
host: string;
port: number;
ssl: tls.ConnectionOptions;
onConnect: () => void;
}
export type ISocketFactory = (args: ISocketFactoryArgs) => net.Socket;
export interface OauthbearerProviderResponse {
value: string;
}
type SASLMechanismOptionsMap = {
plain: { username: string; password: string };
'scram-sha-256': { username: string; password: string };
'scram-sha-512': { username: string; password: string };
aws: {
authorizationIdentity: string;
accessKeyId: string;
secretAccessKey: string;
sessionToken?: string;
};
oauthbearer: {
oauthBearerProvider: () => Promise<OauthbearerProviderResponse>;
};
};
export type SASLMechanism = keyof SASLMechanismOptionsMap;
type SASLMechanismOptions<T> = T extends SASLMechanism
? { mechanism: T } & SASLMechanismOptionsMap[T]
: never;
export type SASLOptions = SASLMechanismOptions<SASLMechanism>;
export interface ProducerConfig {
createPartitioner?: ICustomPartitioner;
retry?: RetryOptions;
@@ -70,24 +100,25 @@ export interface PartitionerArgs {
}
export type ICustomPartitioner = () => (args: PartitionerArgs) => number;
export type DefaultPartitioner = (args: PartitionerArgs) => number;
export type JavaCompatiblePartitioner = (args: PartitionerArgs) => number;
export type DefaultPartitioner = ICustomPartitioner;
export type JavaCompatiblePartitioner = ICustomPartitioner;
export let Partitioners: {
DefaultPartitioner: DefaultPartitioner;
JavaCompatiblePartitioner: JavaCompatiblePartitioner;
};
export interface PartitionMetadata {
export type PartitionMetadata = {
partitionErrorCode: number;
partitionId: number;
leader: number;
replicas: number[];
isr: number[];
}
offlineReplicas?: number[];
};
export interface IHeaders {
[key: string]: Buffer;
[key: string]: Buffer | string | undefined;
}
export interface ConsumerConfig {
@@ -101,15 +132,16 @@ export interface ConsumerConfig {
minBytes?: number;
maxBytes?: number;
maxWaitTimeInMs?: number;
retry?: RetryOptions;
retry?: RetryOptions & {
restartOnFailure?: (err: Error) => Promise<boolean>;
};
allowAutoTopicCreation?: boolean;
maxInFlightRequests?: number;
readUncommitted?: boolean;
rackId?: string;
}
export interface PartitionAssigner {
new (config: { cluster: Cluster }): Assigner;
}
export type PartitionAssigner = (config: { cluster: Cluster }) => Assigner;
export interface CoordinatorMetadata {
errorCode: number;
@@ -120,7 +152,7 @@ export interface CoordinatorMetadata {
};
}
export interface Cluster {
export type Cluster = {
isConnected(): boolean;
connect(): Promise<void>;
disconnect(): Promise<void>;
@@ -140,46 +172,38 @@ export interface Cluster {
}): Promise<CoordinatorMetadata>;
defaultOffset(config: { fromBeginning: boolean }): number;
fetchTopicsOffset(
topics: Array<{
topic: string;
partitions: Array<{ partition: number }>;
fromBeginning: boolean;
}>,
topics: Array<
{
topic: string;
partitions: Array<{ partition: number }>;
} & XOR<{ fromBeginning: boolean }, { fromTimestamp: number }>
>,
): Promise<{
topic: string;
partitions: Array<{ partition: number; offset: string }>;
}>;
}
};
export interface Assignment {
[topic: string]: number[];
}
export type Assignment = { [topic: string]: number[] };
export interface GroupMember {
memberId: string;
memberMetadata: MemberMetadata;
}
export type GroupMember = { memberId: string; memberMetadata: Buffer };
export interface GroupMemberAssignment {
export type GroupMemberAssignment = {
memberId: string;
memberAssignment: Buffer;
}
};
export interface GroupState {
name: string;
metadata: Buffer;
}
export type GroupState = { name: string; metadata: Buffer };
export interface Assigner {
export type Assigner = {
name: string;
version: number;
assign(group: {
members: GroupMember[];
topics: string[];
userData: Buffer;
}): Promise<GroupMemberAssignment[]>;
protocol(subscription: { topics: string[]; userData: Buffer }): GroupState;
}
protocol(subscription: { topics: string[] }): GroupState;
};
export interface RetryOptions {
maxRetryTime?: number;
@@ -201,12 +225,22 @@ export interface ITopicConfig {
configEntries?: object[];
}
export interface ITopicPartitionConfig {
topic: string;
count: number;
assignments?: Array<Array<number>>;
}
export interface ITopicMetadata {
name: string;
partitions: PartitionMetadata[];
}
export enum ResourceType {
/**
* @deprecated
* Use ConfigResourceTypes or AclResourceTypes
*/
export enum ResourceTypes {
UNKNOWN = 0,
ANY = 1,
TOPIC = 2,
@@ -216,10 +250,58 @@ export enum ResourceType {
DELEGATION_TOKEN = 6,
}
export enum AclResourceTypes {
UNKNOWN = 0,
ANY = 1,
TOPIC = 2,
GROUP = 3,
CLUSTER = 4,
TRANSACTIONAL_ID = 5,
DELEGATION_TOKEN = 6,
}
export enum ConfigResourceTypes {
UNKNOWN = 0,
TOPIC = 2,
BROKER = 4,
BROKER_LOGGER = 8,
}
export enum AclPermissionTypes {
UNKNOWN = 0,
ANY = 1,
DENY = 2,
ALLOW = 3,
}
export enum AclOperationTypes {
UNKNOWN = 0,
ANY = 1,
ALL = 2,
READ = 3,
WRITE = 4,
CREATE = 5,
DELETE = 6,
ALTER = 7,
DESCRIBE = 8,
CLUSTER_ACTION = 9,
DESCRIBE_CONFIGS = 10,
ALTER_CONFIGS = 11,
IDEMPOTENT_WRITE = 12,
}
export enum ResourcePatternTypes {
UNKNOWN = 0,
ANY = 1,
MATCH = 2,
LITERAL = 3,
PREFIXED = 4,
}
export interface ResourceConfigQuery {
type: ResourceType;
type: ResourceTypes | ConfigResourceTypes;
name: string;
configNames: string[];
configNames?: string[];
}
export interface ConfigEntries {
@@ -243,26 +325,26 @@ export interface DescribeConfigResponse {
errorCode: number;
errorMessage: string;
resourceName: string;
resourceType: ResourceType;
resourceType: ResourceTypes | ConfigResourceTypes;
}[];
throttleTime: number;
}
export interface IResourceConfig {
type: ResourceType;
type: ResourceTypes | ConfigResourceTypes;
name: string;
configEntries: { name: string; value: string }[];
}
type ValueOf<T> = T[keyof T];
export interface AdminEvents {
export type AdminEvents = {
CONNECT: 'admin.connect';
DISCONNECT: 'admin.disconnect';
REQUEST: 'admin.network.request';
REQUEST_TIMEOUT: 'admin.network.request_timeout';
REQUEST_QUEUE_SIZE: 'admin.network.request_queue_size';
}
};
export interface InstrumentationEvent<T> {
id: string;
@@ -271,6 +353,8 @@ export interface InstrumentationEvent<T> {
payload: T;
}
export type RemoveInstrumentationEventListener<T> = () => void;
export type ConnectEvent = InstrumentationEvent<null>;
export type DisconnectEvent = InstrumentationEvent<null>;
export type RequestEvent = InstrumentationEvent<{
@@ -308,9 +392,69 @@ export interface SeekEntry {
offset: string;
}
export interface Admin {
export interface Acl {
principal: string;
host: string;
operation: AclOperationTypes;
permissionType: AclPermissionTypes;
}
export interface AclResource {
resourceType: AclResourceTypes;
resourceName: string;
resourcePatternType: ResourcePatternTypes;
}
export type AclEntry = Acl & AclResource;
export type DescribeAclResource = AclResource & {
acl: Acl[];
};
export interface DescribeAclResponse {
throttleTime: number;
errorCode: number;
errorMessage?: string;
resources: DescribeAclResource[];
}
export interface AclFilter {
resourceType: AclResourceTypes;
resourceName?: string;
resourcePatternType: ResourcePatternTypes;
principal?: string;
host?: string;
operation: AclOperationTypes;
permissionType: AclPermissionTypes;
}
export interface MatchingAcl {
errorCode: number;
errorMessage?: string;
resourceType: AclResourceTypes;
resourceName: string;
resourcePatternType: ResourcePatternTypes;
principal: string;
host: string;
operation: AclOperationTypes;
permissionType: AclPermissionTypes;
}
export interface DeleteAclFilterResponses {
errorCode: number;
errorMessage?: string;
matchingAcls: MatchingAcl[];
}
export interface DeleteAclResponse {
throttleTime: number;
filterResponses: DeleteAclFilterResponses[];
}
export type Admin = {
connect(): Promise<void>;
disconnect(): Promise<void>;
listTopics(): Promise<string[]>;
createTopics(options: {
validateOnly?: boolean;
waitForLeaders?: boolean;
@@ -318,20 +462,31 @@ export interface Admin {
topics: ITopicConfig[];
}): Promise<boolean>;
deleteTopics(options: { topics: string[]; timeout?: number }): Promise<void>;
fetchTopicMetadata(options: {
createPartitions(options: {
validateOnly?: boolean;
timeout?: number;
topicPartitions: ITopicPartitionConfig[];
}): Promise<boolean>;
fetchTopicMetadata(options?: {
topics: string[];
}): Promise<{ topics: Array<ITopicMetadata> }>;
fetchOffsets(options: {
groupId: string;
topic: string;
}): Promise<
Array<{ partition: number; offset: string; metadata: string | null }>
>;
resolveOffsets?: boolean;
}): Promise<Array<SeekEntry & { metadata: string | null }>>;
fetchTopicOffsets(
topic: string,
): Promise<
Array<{ partition: number; offset: string; high: string; low: string }>
>;
): Promise<Array<SeekEntry & { high: string; low: string }>>;
fetchTopicOffsetsByTimestamp(
topic: string,
timestamp?: number,
): Promise<Array<SeekEntry>>;
describeCluster(): Promise<{
brokers: Array<{ nodeId: number; host: string; port: number }>;
controller: number | null;
clusterId: string;
}>;
setOffsets(options: {
groupId: string;
topic: string;
@@ -350,29 +505,42 @@ export interface Admin {
validateOnly: boolean;
resources: IResourceConfig[];
}): Promise<any>;
listGroups(): Promise<{ groups: GroupOverview[] }>;
deleteGroups(groupIds: string[]): Promise<DeleteGroupsResult[]>;
describeGroups(groupIds: string[]): Promise<GroupDescriptions>;
describeAcls(options: AclFilter): Promise<DescribeAclResponse>;
deleteAcls(options: { filters: AclFilter[] }): Promise<DeleteAclResponse>;
createAcls(options: { acl: AclEntry[] }): Promise<boolean>;
deleteTopicRecords(options: {
topic: string;
partitions: SeekEntry[];
}): Promise<void>;
logger(): Logger;
on(eventName: ValueOf<AdminEvents>, listener: (...args: any[]) => void): void;
on(
eventName: ValueOf<AdminEvents>,
listener: (...args: any[]) => void,
): RemoveInstrumentationEventListener<typeof eventName>;
events: AdminEvents;
}
};
export let PartitionAssigners: { roundRobin: PartitionAssigner };
export interface ISerializer<T> {
encode(value: T): Buffer;
decode(buffer: Buffer): T;
decode(buffer: Buffer): T | null;
}
export interface MemberMetadata {
export type MemberMetadata = {
version: number;
topics: string[];
userData: Buffer;
}
};
export interface MemberAssignment {
export type MemberAssignment = {
version: number;
assignment: Assignment;
userData: Buffer;
}
};
export let AssignerProtocol: {
MemberMetadata: ISerializer<MemberMetadata>;
@@ -400,11 +568,16 @@ export interface LoggerEntryContent {
[key: string]: any;
}
export type Logger = (entry: LogEntry) => void;
export type logCreator = (logLevel: logLevel) => (entry: LogEntry) => void;
export type logCreator = (logLevel: string) => (entry: LogEntry) => void;
export type Logger = {
info: (message: string, extra?: object) => void;
error: (message: string, extra?: object) => void;
warn: (message: string, extra?: object) => void;
debug: (message: string, extra?: object) => void;
};
export interface Broker {
export type Broker = {
isConnected(): boolean;
connect(): Promise<void>;
disconnect(): Promise<void>;
@@ -414,7 +587,12 @@ export interface Broker {
metadata(
topics: string[],
): Promise<{
brokers: Array<{ nodeId: number; host: string; port: number }>;
brokers: Array<{
nodeId: number;
host: string;
port: number;
rack?: string;
}>;
topicMetadata: Array<{
topicErrorCode: number;
topic: number;
@@ -431,17 +609,33 @@ export interface Broker {
partitions: Array<{ partition: number; offset: string }>;
}>;
}): Promise<any>;
}
fetch(request: {
replicaId?: number;
isolationLevel?: number;
maxWaitTime?: number;
minBytes?: number;
maxBytes?: number;
topics: Array<{
topic: string;
partitions: Array<{
partition: number;
fetchOffset: string;
maxBytes: number;
}>;
}>;
rackId?: string;
}): Promise<any>;
};
export interface KafkaMessage {
export type KafkaMessage = {
key: Buffer;
value: Buffer;
value: Buffer | null;
timestamp: string;
size: number;
attributes: number;
offset: string;
headers?: IHeaders;
}
};
export interface ProducerRecord {
topic: string;
@@ -451,13 +645,16 @@ export interface ProducerRecord {
compression?: CompressionTypes;
}
export interface RecordMetadata {
export type RecordMetadata = {
topicName: string;
partition: number;
errorCode: number;
offset: string;
timestamp: string;
}
offset?: string;
timestamp?: string;
baseOffset?: string;
logAppendTime?: string;
logStartOffset?: string;
};
export interface TopicMessages {
topic: string;
@@ -465,10 +662,10 @@ export interface TopicMessages {
}
export interface ProducerBatch {
acks: number;
timeout: number;
compression: CompressionTypes;
topicMessages: TopicMessages[];
acks?: number;
timeout?: number;
compression?: CompressionTypes;
topicMessages?: TopicMessages[];
}
export interface PartitionOffset {
@@ -485,18 +682,18 @@ export interface Offsets {
topics: TopicOffsets[];
}
interface Sender {
type Sender = {
send(record: ProducerRecord): Promise<RecordMetadata[]>;
sendBatch(batch: ProducerBatch): Promise<RecordMetadata[]>;
}
};
export interface ProducerEvents {
export type ProducerEvents = {
CONNECT: 'producer.connect';
DISCONNECT: 'producer.disconnect';
REQUEST: 'producer.network.request';
REQUEST_TIMEOUT: 'producer.network.request_timeout';
REQUEST_QUEUE_SIZE: 'producer.network.request_queue_size';
}
};
export type Producer = Sender & {
connect(): Promise<void>;
@@ -506,7 +703,7 @@ export type Producer = Sender & {
on(
eventName: ValueOf<ProducerEvents>,
listener: (...args: any[]) => void,
): void;
): RemoveInstrumentationEventListener<typeof eventName>;
transaction(): Promise<Transaction>;
logger(): Logger;
};
@@ -518,41 +715,54 @@ export type Transaction = Sender & {
isActive(): boolean;
};
export interface ConsumerGroup {
export type ConsumerGroup = {
groupId: string;
generationId: number;
memberId: string;
coordinator: Broker;
}
};
export interface MemberDescription {
export type MemberDescription = {
clientHost: string;
clientId: string;
memberId: string;
memberAssignment: Buffer;
memberMetadata: Buffer;
}
};
export interface GroupDescription {
// See https://github.com/apache/kafka/blob/2.4.0/clients/src/main/java/org/apache/kafka/common/ConsumerGroupState.java#L25
export type ConsumerGroupState =
| 'Unknown'
| 'PreparingRebalance'
| 'CompletingRebalance'
| 'Stable'
| 'Dead'
| 'Empty';
export type GroupDescription = {
groupId: string;
members: MemberDescription[];
protocol: string;
protocolType: string;
state: string;
}
state: ConsumerGroupState;
};
export interface TopicPartitions {
topic: string;
partitions: number[];
}
export interface TopicPartitionOffsetAndMedata {
export type GroupDescriptions = {
groups: GroupDescription[];
};
export type TopicPartitions = { topic: string; partitions: number[] };
export type TopicPartitionOffsetAndMetadata = {
topic: string;
partition: number;
offset: string;
metadata?: string | null;
}
};
export interface Batch {
// TODO: Remove with 2.x
export type TopicPartitionOffsetAndMedata = TopicPartitionOffsetAndMetadata;
export type Batch = {
topic: string;
partition: number;
highWatermark: string;
@@ -562,12 +772,24 @@ export interface Batch {
lastOffset(): string;
offsetLag(): string;
offsetLagLow(): string;
}
};
export interface ConsumerEvents {
export type GroupOverview = {
groupId: string;
protocolType: string;
};
export type DeleteGroupsResult = {
groupId: string;
errorCode?: number;
error?: KafkaJSProtocolError;
};
export type ConsumerEvents = {
HEARTBEAT: 'consumer.heartbeat';
COMMIT_OFFSETS: 'consumer.commit_offsets';
GROUP_JOIN: 'consumer.group_join';
FETCH_START: 'consumer.fetch_start';
FETCH: 'consumer.fetch';
START_BATCH_PROCESS: 'consumer.start_batch_process';
END_BATCH_PROCESS: 'consumer.end_batch_process';
@@ -575,10 +797,11 @@ export interface ConsumerEvents {
DISCONNECT: 'consumer.disconnect';
STOP: 'consumer.stop';
CRASH: 'consumer.crash';
RECEIVED_UNSUBSCRIBED_TOPICS: 'consumer.received_unsubscribed_topics';
REQUEST: 'consumer.network.request';
REQUEST_TIMEOUT: 'consumer.network.request_timeout';
REQUEST_QUEUE_SIZE: 'consumer.network.request_queue_size';
}
};
export type ConsumerHeartbeatEvent = InstrumentationEvent<{
groupId: string;
memberId: string;
@@ -622,15 +845,22 @@ interface IBatchProcessEvent {
firstOffset: string;
lastOffset: string;
}
export type ConsumerStartBatchProcessEvent = InstrumentationEvent<
IBatchProcessEvent
>;
export type ConsumerStartBatchProcessEvent = InstrumentationEvent<IBatchProcessEvent>;
export type ConsumerEndBatchProcessEvent = InstrumentationEvent<
IBatchProcessEvent & { duration: number }
>;
export type ConsumerCrashEvent = InstrumentationEvent<{
error: Error;
groupId: string;
restart: boolean;
}>;
export type ConsumerReceivedUnsubcribedTopicsEvent = InstrumentationEvent<{
groupId: string;
generationId: number;
memberId: string;
assignedTopics: string[];
topicsSubscribed: string[];
topicsNotSubscribed: string[];
}>;
export interface OffsetsByTopicPartition {
@@ -648,7 +878,7 @@ export interface EachBatchPayload {
resolveOffset(offset: string): void;
heartbeat(): Promise<void>;
commitOffsetsIfNecessary(offsets?: Offsets): Promise<void>;
uncommittedOffsets(): Promise<OffsetsByTopicPartition>;
uncommittedOffsets(): OffsetsByTopicPartition;
isRunning(): boolean;
isStale(): boolean;
}
@@ -665,25 +895,29 @@ export type ConsumerEachMessagePayload = EachMessagePayload;
*/
export type ConsumerEachBatchPayload = EachBatchPayload;
export interface Consumer {
export type ConsumerRunConfig = {
autoCommit?: boolean;
autoCommitInterval?: number | null;
autoCommitThreshold?: number | null;
eachBatchAutoResolve?: boolean;
partitionsConsumedConcurrently?: number;
eachBatch?: (payload: EachBatchPayload) => Promise<void>;
eachMessage?: (payload: EachMessagePayload) => Promise<void>;
};
export type ConsumerSubscribeTopic = {
topic: string | RegExp;
fromBeginning?: boolean;
};
export type Consumer = {
connect(): Promise<void>;
disconnect(): Promise<void>;
subscribe(topic: {
topic: string | RegExp;
fromBeginning?: boolean;
}): Promise<void>;
subscribe(topic: ConsumerSubscribeTopic): Promise<void>;
stop(): Promise<void>;
run(config?: {
autoCommit?: boolean;
autoCommitInterval?: number | null;
autoCommitThreshold?: number | null;
eachBatchAutoResolve?: boolean;
partitionsConsumedConcurrently?: number;
eachBatch?: (payload: EachBatchPayload) => Promise<void>;
eachMessage?: (payload: EachMessagePayload) => Promise<void>;
}): Promise<void>;
run(config?: ConsumerRunConfig): Promise<void>;
commitOffsets(
topicPartitions: Array<TopicPartitionOffsetAndMedata>,
topicPartitions: Array<TopicPartitionOffsetAndMetadata>,
): Promise<void>;
seek(topicPartition: {
topic: string;
@@ -692,14 +926,15 @@ export interface Consumer {
}): void;
describeGroup(): Promise<GroupDescription>;
pause(topics: Array<{ topic: string; partitions?: number[] }>): void;
paused(): TopicPartitions[];
resume(topics: Array<{ topic: string; partitions?: number[] }>): void;
on(
eventName: ValueOf<ConsumerEvents>,
listener: (...args: any[]) => void,
): void;
): RemoveInstrumentationEventListener<typeof eventName>;
logger(): Logger;
events: ConsumerEvents;
}
};
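The extracted `ConsumerRunConfig` above can be exercised without a broker. The sketch below mirrors its shape with local stand-in types so it is self-contained; these names are illustrative copies of the definitions shown in the diff, not imports from kafkajs:

```typescript
// Local stand-ins mirroring the shapes above (illustrative, not the kafkajs exports).
type EachMessagePayload = {
  topic: string;
  partition: number;
  message: { value: Buffer | null };
};

type ConsumerRunConfig = {
  autoCommit?: boolean;
  autoCommitInterval?: number | null;
  autoCommitThreshold?: number | null;
  eachBatchAutoResolve?: boolean;
  partitionsConsumedConcurrently?: number;
  eachMessage?: (payload: EachMessagePayload) => Promise<void>;
};

// An object of this shape is what consumer.run(config) accepts.
const runConfig: ConsumerRunConfig = {
  autoCommit: true,
  partitionsConsumedConcurrently: 3,
  eachMessage: async ({ topic, partition, message }) => {
    // value is Buffer | null after the typing fix above, so guard before decoding
    console.log(`${topic}[${partition}]: ${message.value?.toString() ?? '<null>'}`);
  },
};
```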
export enum CompressionTypes {
None = 0,
@@ -715,3 +950,186 @@ export let CompressionCodecs: {
[CompressionTypes.LZ4]: () => any;
[CompressionTypes.ZSTD]: () => any;
};
export declare class KafkaJSError extends Error {
readonly message: Error['message'];
readonly name: string;
readonly retriable: boolean;
readonly helpUrl?: string;
constructor(e: Error | string, metadata?: KafkaJSErrorMetadata);
}
export declare class KafkaJSNonRetriableError extends KafkaJSError {
constructor(e: Error | string);
}
export declare class KafkaJSProtocolError extends KafkaJSError {
readonly code: number;
readonly type: string;
constructor(e: Error | string);
}
export declare class KafkaJSOffsetOutOfRange extends KafkaJSProtocolError {
readonly topic: string;
readonly partition: number;
constructor(e: Error | string, metadata?: KafkaJSOffsetOutOfRangeMetadata);
}
export declare class KafkaJSNumberOfRetriesExceeded extends KafkaJSNonRetriableError {
readonly stack: string;
readonly originalError: Error;
readonly retryCount: number;
readonly retryTime: number;
constructor(
e: Error | string,
metadata?: KafkaJSNumberOfRetriesExceededMetadata,
);
}
export declare class KafkaJSConnectionError extends KafkaJSError {
readonly broker: string;
constructor(e: Error | string, metadata?: KafkaJSConnectionErrorMetadata);
}
export declare class KafkaJSRequestTimeoutError extends KafkaJSError {
readonly broker: string;
readonly correlationId: number;
readonly createdAt: number;
readonly sentAt: number;
readonly pendingDuration: number;
constructor(e: Error | string, metadata?: KafkaJSRequestTimeoutErrorMetadata);
}
export declare class KafkaJSMetadataNotLoaded extends KafkaJSError {
constructor();
}
export declare class KafkaJSTopicMetadataNotLoaded extends KafkaJSMetadataNotLoaded {
readonly topic: string;
constructor(
e: Error | string,
metadata?: KafkaJSTopicMetadataNotLoadedMetadata,
);
}
export declare class KafkaJSStaleTopicMetadataAssignment extends KafkaJSError {
readonly topic: string;
readonly unknownPartitions: number;
constructor(
e: Error | string,
metadata?: KafkaJSStaleTopicMetadataAssignmentMetadata,
);
}
export declare class KafkaJSServerDoesNotSupportApiKey extends KafkaJSNonRetriableError {
readonly apiKey: number;
readonly apiName: string;
constructor(
e: Error | string,
metadata?: KafkaJSServerDoesNotSupportApiKeyMetadata,
);
}
export declare class KafkaJSBrokerNotFound extends KafkaJSError {
constructor();
}
export declare class KafkaJSPartialMessageError extends KafkaJSError {
constructor();
}
export declare class KafkaJSSASLAuthenticationError extends KafkaJSError {
constructor();
}
export declare class KafkaJSGroupCoordinatorNotFound extends KafkaJSError {
constructor();
}
export declare class KafkaJSNotImplemented extends KafkaJSError {
constructor();
}
export declare class KafkaJSTimeout extends KafkaJSError {
constructor();
}
export declare class KafkaJSLockTimeout extends KafkaJSError {
constructor();
}
export declare class KafkaJSUnsupportedMagicByteInMessageSet extends KafkaJSError {
constructor();
}
export declare class KafkaJSDeleteGroupsError extends KafkaJSError {
readonly groups: DeleteGroupsResult[];
constructor(e: Error | string, groups?: KafkaJSDeleteGroupsErrorGroups[]);
}
export declare class KafkaJSDeleteTopicRecordsError extends KafkaJSError {
constructor(metadata: KafkaJSDeleteTopicRecordsErrorTopic);
}
export interface KafkaJSDeleteGroupsErrorGroups {
groupId: string;
errorCode: number;
error: KafkaJSError;
}
export interface KafkaJSDeleteTopicRecordsErrorTopic {
topic: string;
partitions: KafkaJSDeleteTopicRecordsErrorPartition[];
}
export interface KafkaJSDeleteTopicRecordsErrorPartition {
partition: number;
offset: string;
error: KafkaJSError;
}
export interface KafkaJSErrorMetadata {
retriable?: boolean;
topic?: string;
partitionId?: number;
metadata?: PartitionMetadata;
}
export interface KafkaJSOffsetOutOfRangeMetadata {
topic: string;
partition: number;
}
export interface KafkaJSNumberOfRetriesExceededMetadata {
retryCount: number;
retryTime: number;
}
export interface KafkaJSConnectionErrorMetadata {
broker?: string;
code?: string;
}
export interface KafkaJSRequestTimeoutErrorMetadata {
broker: string;
clientId: string;
correlationId: number;
createdAt: number;
sentAt: number;
pendingDuration: number;
}
export interface KafkaJSTopicMetadataNotLoadedMetadata {
topic: string;
}
export interface KafkaJSStaleTopicMetadataAssignmentMetadata {
topic: string;
unknownPartitions: PartitionMetadata[];
}
export interface KafkaJSServerDoesNotSupportApiKeyMetadata {
apiKey: number;
apiName: string;
}


@@ -9,3 +9,22 @@ export interface RmqUrl {
heartbeat?: number;
vhost?: string;
}
export interface AmqpConnectionManagerSocketOptions {
reconnectTimeInSeconds?: number;
heartbeatIntervalInSeconds?: number;
findServers?: () => string | string[];
connectionOptions?: any;
}
export interface AmqplibQueueOptions {
durable?: boolean;
autoDelete?: boolean;
arguments?: any;
messageTtl?: number;
expires?: number;
deadLetterExchange?: string;
deadLetterRoutingKey?: string;
maxLength?: number;
maxPriority?: number;
}


@@ -1,4 +1,4 @@
export * from './json-socket';
export * from './kafka-logger';
export * from './kafka-parser';
export * from './kafka-round-robin-partition-assigner';
export * from './kafka-reply-partition-assigner';


@@ -0,0 +1,202 @@
import { loadPackage } from '@nestjs/common/utils/load-package.util';
import { isUndefined } from '@nestjs/common/utils/shared.utils';
import { ClientKafka } from '../client/client-kafka';
import {
Cluster,
GroupMember,
GroupMemberAssignment,
GroupState,
MemberMetadata,
} from '../external/kafka.interface';
let kafkaPackage: any = {};
export class KafkaReplyPartitionAssigner {
readonly name = 'NestReplyPartitionAssigner';
readonly version = 1;
constructor(
private readonly clientKafka: ClientKafka,
private readonly config: {
cluster: Cluster;
},
) {
kafkaPackage = loadPackage(
'kafkajs',
KafkaReplyPartitionAssigner.name,
() => require('kafkajs'),
);
}
/**
 * Builds the group assignment. Note: this process can result in imbalanced assignments.
 * @param group.members array of group members, e.g. [{ memberId: 'test-5f93f5a3' }]
 * @param group.topics array of subscribed topic names
 * @returns partitions assigned per topic, per member
 */
public async assign(group: {
members: GroupMember[];
topics: string[];
}): Promise<GroupMemberAssignment[]> {
const assignment = {};
const previousAssignment = {};
const membersCount = group.members.length;
const decodedMembers = group.members.map(member =>
this.decodeMember(member),
);
const sortedMemberIds = decodedMembers
.map(member => member.memberId)
.sort();
// build the previous assignment and an inverse map of topic > partition > memberId for lookup
decodedMembers.forEach(member => {
if (
!previousAssignment[member.memberId] &&
Object.keys(member.previousAssignment).length > 0
) {
previousAssignment[member.memberId] = member.previousAssignment;
}
});
// build a collection of topics and partitions
const topicsPartitions = group.topics
.map(topic => {
const partitionMetadata = this.config.cluster.findTopicPartitionMetadata(
topic,
);
return partitionMetadata.map(m => {
return {
topic,
partitionId: m.partitionId,
};
});
})
.reduce((acc, val) => acc.concat(val), []);
// create the new assignment by populating the members with the first partition of the topics
sortedMemberIds.forEach(assignee => {
if (!assignment[assignee]) {
assignment[assignee] = {};
}
// add topics to each member
group.topics.forEach(topic => {
if (!assignment[assignee][topic]) {
assignment[assignee][topic] = [];
}
// see if the topic and partition belong to a previous assignment
if (
previousAssignment[assignee] &&
!isUndefined(previousAssignment[assignee][topic])
) {
// take the minimum partition since replies will be sent to the minimum partition
const firstPartition = previousAssignment[assignee][topic];
// create the assignment with the first partition
assignment[assignee][topic].push(firstPartition);
// find and remove this topic and partition from the topicPartitions to be assigned later
const topicsPartitionsIndex = topicsPartitions.findIndex(
topicPartition => {
return (
topicPartition.topic === topic &&
topicPartition.partitionId === firstPartition
);
},
);
// only continue if we found a partition matching this topic
if (topicsPartitionsIndex !== -1) {
// remove inline
topicsPartitions.splice(topicsPartitionsIndex, 1);
}
}
});
});
// check for member topics that have a partition length of 0
sortedMemberIds.forEach(assignee => {
group.topics.forEach(topic => {
// only continue if there are no partitions for assignee's topic
if (assignment[assignee][topic].length === 0) {
// find the first partition for this topic
const topicsPartitionsIndex = topicsPartitions.findIndex(
topicPartition => {
return topicPartition.topic === topic;
},
);
if (topicsPartitionsIndex !== -1) {
// find and set the topic partition
const partition =
topicsPartitions[topicsPartitionsIndex].partitionId;
assignment[assignee][topic].push(partition);
// remove this partition from the topics partitions collection
topicsPartitions.splice(topicsPartitionsIndex, 1);
}
}
});
});
// then balance out the rest of the topic partitions across the members
const insertAssignmentsByTopic = (topicPartition, i) => {
const assignee = sortedMemberIds[i % membersCount];
assignment[assignee][topicPartition.topic].push(
topicPartition.partitionId,
);
};
// build the assignments
topicsPartitions.forEach(insertAssignmentsByTopic);
// encode the end result
return Object.keys(assignment).map(memberId => ({
memberId,
memberAssignment: kafkaPackage.AssignerProtocol.MemberAssignment.encode({
version: this.version,
assignment: assignment[memberId],
}),
}));
}
public protocol(subscription: {
topics: string[];
userData: Buffer;
}): GroupState {
const stringifiedUserData = JSON.stringify({
previousAssignment: this.getPreviousAssignment(),
});
subscription.userData = Buffer.from(stringifiedUserData);
return {
name: this.name,
metadata: kafkaPackage.AssignerProtocol.MemberMetadata.encode({
version: this.version,
topics: subscription.topics,
userData: subscription.userData,
}),
};
}
public getPreviousAssignment() {
return this.clientKafka.getConsumerAssignments();
}
public decodeMember(member: GroupMember) {
const memberMetadata = kafkaPackage.AssignerProtocol.MemberMetadata.decode(
member.memberMetadata,
) as MemberMetadata;
const memberUserData = JSON.parse(memberMetadata.userData.toString());
return {
memberId: member.memberId,
previousAssignment: memberUserData.previousAssignment,
};
}
}
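The strategy that `assign()` above implements (each member keeps its previously assigned reply partition when still available, members left empty take the next free partition, and the remainder is balanced round-robin) can be modeled in a few dependency-free lines. All names here are hypothetical; this is a simplified model using one reply partition per member, not the class's actual encoding logic:

```typescript
// Simplified model of the sticky-then-round-robin reassignment strategy above.
function assignReplyPartitions(
  memberIds: string[],
  partitions: number[],
  previous: Record<string, number | undefined>,
): Record<string, number[]> {
  const pool = [...partitions];
  const assignment: Record<string, number[]> = {};
  const sorted = [...memberIds].sort();
  // first pass: honor previous assignments so replies keep landing on the same partition
  for (const member of sorted) {
    assignment[member] = [];
    const prev = previous[member];
    if (prev !== undefined && pool.includes(prev)) {
      assignment[member].push(prev);
      pool.splice(pool.indexOf(prev), 1);
    }
  }
  // second pass: members that got nothing take the next free partition
  for (const member of sorted) {
    if (assignment[member].length === 0 && pool.length > 0) {
      assignment[member].push(pool.shift()!);
    }
  }
  // balance the rest of the partitions round-robin across members
  pool.forEach((p, i) => assignment[sorted[i % sorted.length]].push(p));
  return assignment;
}
```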


@@ -1,119 +0,0 @@
import { loadPackage } from '@nestjs/common/utils/load-package.util';
import {
Cluster,
GroupMember,
GroupMemberAssignment,
GroupState,
MemberMetadata,
} from '../external/kafka.interface';
let kafkaPackage: any = {};
const time = process.hrtime();
export class KafkaRoundRobinPartitionAssigner {
readonly name = 'RoundRobinByTime';
readonly version = 1;
constructor(private readonly config: { cluster: Cluster }) {
kafkaPackage = loadPackage(
'kafkajs',
KafkaRoundRobinPartitionAssigner.name,
() => require('kafkajs'),
);
}
/**
* This process can result in imbalanced assignments
* @param {array} members array of members, e.g: [{ memberId: 'test-5f93f5a3' }]
* @param {array} topics
* @param {Buffer} userData
* @returns {array} object partitions per topic per member
*/
public async assign(group: {
members: GroupMember[];
topics: string[];
userData: Buffer;
}): Promise<GroupMemberAssignment[]> {
const membersCount = group.members.length;
const assignment = {};
const sortedMembers = group.members
.map(member => this.mapToTimeAndMemberId(member))
.sort((a, b) => this.sortByTime(a, b))
.map(member => member.memberId);
sortedMembers.forEach(memberId => {
assignment[memberId] = {};
});
const insertAssignmentsByTopic = (topic: string) => {
const partitionMetadata = this.config.cluster.findTopicPartitionMetadata(
topic,
);
const partitions = partitionMetadata.map(m => m.partitionId);
sortedMembers.forEach((memberId, i) => {
if (!assignment[memberId][topic]) {
assignment[memberId][topic] = [];
}
assignment[memberId][topic].push(
...partitions.filter(id => id % membersCount === i),
);
});
};
group.topics.forEach(insertAssignmentsByTopic);
return Object.keys(assignment).map(memberId => ({
memberId,
memberAssignment: kafkaPackage.AssignerProtocol.MemberAssignment.encode({
version: this.version,
assignment: assignment[memberId],
userData: group.userData,
}),
}));
}
public protocol(subscription: {
topics: string[];
userData: Buffer;
}): GroupState {
const stringifiedTimeObject = JSON.stringify({
time: this.getTime(),
});
subscription.userData = Buffer.from(stringifiedTimeObject);
return {
name: this.name,
metadata: kafkaPackage.AssignerProtocol.MemberMetadata.encode({
version: this.version,
topics: subscription.topics,
userData: subscription.userData,
}),
};
}
public getTime(): [number, number] {
return time;
}
public mapToTimeAndMemberId(member: GroupMember) {
const memberMetadata = kafkaPackage.AssignerProtocol.MemberMetadata.decode(
member.memberMetadata,
) as MemberMetadata;
const memberUserData = JSON.parse(memberMetadata.userData.toString());
return {
memberId: member.memberId,
time: memberUserData.time,
};
}
public sortByTime(a: Record<'time', number[]>, b: Record<'time', number[]>) {
// if seconds are equal sort by nanoseconds
if (a.time[0] === b.time[0]) {
return a.time[1] - b.time[1];
}
// sort by seconds
return a.time[0] - b.time[0];
}
}


@@ -1,18 +1,20 @@
import { Transport } from '../enums/transport.enum';
import { ChannelOptions } from '../external/grpc-options.interface';
import {
CompressionTypes,
ConsumerConfig,
ConsumerRunConfig,
ConsumerSubscribeTopic,
KafkaConfig,
ProducerConfig,
} from '../external/kafka-options.interface';
ProducerRecord,
} from '../external/kafka.interface';
import { MqttClientOptions } from '../external/mqtt-options.interface';
import { ClientOpts } from '../external/redis.interface';
import { RmqUrl } from '../external/rmq-url.interface';
import { Server } from '../server/server';
import { CustomTransportStrategy } from './custom-transport-strategy.interface';
import { Deserializer } from './deserializer.interface';
import { Serializer } from './serializer.interface';
import { RmqUrl } from '../external/rmq-url.interface';
export type MicroserviceOptions =
| GrpcOptions
@@ -127,8 +129,8 @@ export interface RmqOptions {
queue?: string;
prefetchCount?: number;
isGlobalPrefetchCount?: boolean;
queueOptions?: any;
socketOptions?: any;
queueOptions?: any; // AmqplibQueueOptions;
socketOptions?: any; // AmqpConnectionManagerSocketOptions;
noAck?: boolean;
serializer?: Serializer;
deserializer?: Deserializer;
@@ -140,24 +142,13 @@ export interface RmqOptions {
export interface KafkaOptions {
transport?: Transport.KAFKA;
options?: {
postfixId?: string;
client?: KafkaConfig;
consumer?: ConsumerConfig;
run?: {
autoCommit?: boolean;
autoCommitInterval?: number | null;
autoCommitThreshold?: number | null;
eachBatchAutoResolve?: boolean;
partitionsConsumedConcurrently?: number;
};
subscribe?: {
fromBeginning?: boolean;
};
run?: Omit<ConsumerRunConfig, 'eachBatch' | 'eachMessage'>;
subscribe?: Omit<ConsumerSubscribeTopic, 'topic'>;
producer?: ProducerConfig;
send?: {
acks?: number;
timeout?: number;
compression?: CompressionTypes;
};
send?: Omit<ProducerRecord, 'topics' | 'messages'>;
serializer?: Serializer;
deserializer?: Deserializer;
};


@@ -123,7 +123,7 @@ export class NestMicroservice
!this.isInitialized && (await this.registerModules());
this.logger.log(MESSAGES.MICROSERVICE_READY);
return new Promise(resolve => this.server.listen(resolve));
return new Promise<void>(resolve => this.server.listen(resolve));
}
public async close(): Promise<any> {


@@ -1,6 +1,6 @@
{
"name": "@nestjs/microservices",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@microservices)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -22,12 +22,51 @@
"tslib": "2.0.3"
},
"devDependencies": {
"@nestjs/common": "7.5.5",
"@nestjs/core": "7.5.5"
"@nestjs/common": "7.6.3",
"@nestjs/core": "7.6.3"
},
"peerDependencies": {
"@nestjs/common": "^7.0.0",
"@nestjs/core": "^7.0.0",
"@nestjs/websockets": "^7.0.0",
"amqp-connection-manager": "*",
"amqplib": "*",
"cache-manager": "*",
"grpc": "*",
"kafkajs": "*",
"mqtt": "*",
"nats": "*",
"redis": "*",
"reflect-metadata": "^0.1.12",
"rxjs": "^6.0.0"
},
"peerDependenciesMeta": {
"@nestjs/websockets": {
"optional": true
},
"cache-manager": {
"optional": true
},
"grpc": {
"optional": true
},
"kafkajs": {
"optional": true
},
"mqtt": {
"optional": true
},
"nats": {
"optional": true
},
"redis": {
"optional": true
},
"amqplib": {
"optional": true
},
"amqp-connection-manager": {
"optional": true
}
}
}


@@ -19,6 +19,7 @@ import {
KafkaMessage,
Message,
Producer,
RecordMetadata,
} from '../external/kafka.interface';
import { KafkaLogger, KafkaParser } from '../helpers';
import {
@@ -50,17 +51,16 @@ export class ServerKafka extends Server implements CustomTransportStrategy {
this.getOptionsProp(this.options, 'client') || ({} as KafkaConfig);
const consumerOptions =
this.getOptionsProp(this.options, 'consumer') || ({} as ConsumerConfig);
const postfixId =
this.getOptionsProp(this.options, 'postfixId') || '-server';
this.brokers = clientOptions.brokers || [KAFKA_DEFAULT_BROKER];
// append a unique id to the clientId and groupId
// so they don't collide with a microservices client
this.clientId =
(clientOptions.clientId || KAFKA_DEFAULT_CLIENT) +
(clientOptions.clientIdPostfix || '-server');
this.groupId =
(consumerOptions.groupId || KAFKA_DEFAULT_GROUP) +
(clientOptions.clientIdPostfix || '-server');
(clientOptions.clientId || KAFKA_DEFAULT_CLIENT) + postfixId;
this.groupId = (consumerOptions.groupId || KAFKA_DEFAULT_GROUP) + postfixId;
kafkaPackage = this.loadPackage('kafkajs', ServerKafka.name, () =>
require('kafkajs'),
@@ -75,9 +75,9 @@ export class ServerKafka extends Server implements CustomTransportStrategy {
await this.start(callback);
}
public close(): void {
this.consumer && this.consumer.disconnect();
this.producer && this.producer.disconnect();
public async close(): Promise<void> {
this.consumer && (await this.consumer.disconnect());
this.producer && (await this.producer.disconnect());
this.consumer = null;
this.producer = null;
this.client = null;
@@ -130,7 +130,7 @@ export class ServerKafka extends Server implements CustomTransportStrategy {
replyTopic: string,
replyPartition: string,
correlationId: string,
): (data: any) => any {
): (data: any) => Promise<RecordMetadata[]> {
return (data: any) =>
this.sendMessage(data, replyTopic, replyPartition, correlationId);
}
@@ -184,7 +184,7 @@ export class ServerKafka extends Server implements CustomTransportStrategy {
replyTopic: string,
replyPartition: string,
correlationId: string,
): void {
): Promise<RecordMetadata[]> {
const outgoingMessage = this.serializer.serialize(message.response);
this.assignReplyPartition(replyPartition, outgoingMessage);
this.assignCorrelationIdHeader(correlationId, outgoingMessage);
@@ -198,7 +198,7 @@ export class ServerKafka extends Server implements CustomTransportStrategy {
},
this.options.send || {},
);
this.producer.send(replyMessage);
return this.producer.send(replyMessage);
}
public assignIsDisposedHeader(


@@ -18,13 +18,13 @@ import {
} from '../constants';
import { RmqContext } from '../ctx-host';
import { Transport } from '../enums';
import { RmqUrl } from '../external/rmq-url.interface';
import { CustomTransportStrategy, RmqOptions } from '../interfaces';
import {
IncomingRequest,
OutgoingResponse,
} from '../interfaces/packet.interface';
import { Server } from './server';
import { RmqUrl } from '../external/rmq-url.interface';
let rqmPackage: any = {};


@@ -61,14 +61,16 @@ export abstract class Server {
public send(
stream$: Observable<any>,
respond: (data: WritePacket) => void,
respond: (data: WritePacket) => unknown | Promise<unknown>,
): Subscription {
let dataBuffer: WritePacket[] = null;
const scheduleOnNextTick = (data: WritePacket) => {
if (!dataBuffer) {
dataBuffer = [data];
process.nextTick(() => {
dataBuffer.forEach(buffer => respond(buffer));
process.nextTick(async () => {
for (const item of dataBuffer) {
await respond(item);
}
dataBuffer = null;
});
} else if (!data.isDisposed) {

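The switch above from `dataBuffer.forEach(...)` to a sequential `for...of` with `await` is the substance of the "await respond callback" fix: `forEach` fires async callbacks without awaiting them, so buffered packets could still be in flight (and items could be dropped on teardown) when the loop "finished". A minimal, dependency-free illustration of the difference, with hypothetical names:

```typescript
// forEach does not await async callbacks: the trailing push runs before any item is handled.
async function flushWithForEach(items: number[], out: number[]) {
  items.forEach(async item => {
    await Promise.resolve(); // simulate an async respond()
    out.push(item);
  });
  out.push(-1); // runs before any callback body resumes
}

// A sequential for...of awaits each respond(), preserving order and completion.
async function flushSequentially(items: number[], out: number[]) {
  for (const item of items) {
    await Promise.resolve();
    out.push(item);
  }
  out.push(-1); // runs only after every item was handled
}
```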

@@ -3,7 +3,6 @@ import * as sinon from 'sinon';
import { ClientKafka } from '../../client/client-kafka';
import { NO_MESSAGE_HANDLER } from '../../constants';
import { KafkaHeaders } from '../../enums';
import { InvalidKafkaClientTopicPartitionException } from '../../errors/invalid-kafka-client-topic-partition.exception';
import { InvalidKafkaClientTopicException } from '../../errors/invalid-kafka-client-topic.exception';
import {
ConsumerGroupJoinEvent,
@@ -269,6 +268,7 @@ describe('ClientKafka', () => {
expect(createClientStub.calledOnce).to.be.true;
expect(producerStub.calledOnce).to.be.true;
expect(consumerStub.calledOnce).to.be.true;
expect(on.calledOnce).to.be.true;
@@ -314,13 +314,19 @@ describe('ClientKafka', () => {
memberId: 'member-1',
memberAssignment: {
'topic-a': [0, 1, 2],
'topic-b': [3, 4, 5],
},
},
};
client['setConsumerAssignments'](consumerAssignments);
expect(client['consumerAssignments']).to.deep.eq(
consumerAssignments.payload.memberAssignment,
// consumerAssignments.payload.memberAssignment,
{
'topic-a': 0,
'topic-b': 3,
},
);
});
});
@@ -493,10 +499,22 @@ describe('ClientKafka', () => {
});
});
describe('getConsumerAssignments', () => {
it('should get consumer assignments', () => {
client['consumerAssignments'] = {
[replyTopic]: 0,
};
const result = client.getConsumerAssignments();
expect(result).to.deep.eq(client['consumerAssignments']);
});
});
describe('getReplyTopicPartition', () => {
it('should get reply partition', () => {
client['consumerAssignments'] = {
[replyTopic]: [0],
[replyTopic]: 0,
};
const result = client['getReplyTopicPartition'](replyTopic);
@@ -504,19 +522,17 @@ describe('ClientKafka', () => {
expect(result).to.eq('0');
});
it('should throw error when the topic is being consumed but is not assigned partitions', () => {
client['consumerAssignments'] = {
[replyTopic]: [],
};
it('should throw error when the topic is not being consumed', () => {
client['consumerAssignments'] = {};
expect(() => client['getReplyTopicPartition'](replyTopic)).to.throw(
InvalidKafkaClientTopicPartitionException,
InvalidKafkaClientTopicException,
);
});
it('should throw error when the topic is not being consumer', () => {
it('should throw error when the topic is not being consumed', () => {
client['consumerAssignments'] = {
[topic]: [],
[topic]: undefined,
};
expect(() => client['getReplyTopicPartition'](replyTopic)).to.throw(
@@ -551,7 +567,7 @@ describe('ClientKafka', () => {
'getReplyTopicPartition',
);
routingMapSetSpy = sinon.spy((client as any).routingMap, 'set');
sendSpy = sinon.spy();
sendSpy = sinon.spy(() => Promise.resolve());
// stub
assignPacketIdStub = sinon
@@ -568,7 +584,7 @@ describe('ClientKafka', () => {
// set
client['consumerAssignments'] = {
[replyTopic]: [parseFloat(replyPartition)],
[replyTopic]: parseFloat(replyPartition),
};
});


@@ -0,0 +1,286 @@
import { expect } from 'chai';
import * as sinon from 'sinon';
import * as Kafka from 'kafkajs';
import { KafkaReplyPartitionAssigner } from '../../helpers/kafka-reply-partition-assigner';
import { ClientKafka } from '../../client/client-kafka';
describe('kafka reply partition assigner', () => {
let cluster, topics, metadata, assigner, client;
let getConsumerAssignments: sinon.SinonSpy;
let getPreviousAssignment: sinon.SinonSpy;
beforeEach(() => {
metadata = {};
cluster = { findTopicPartitionMetadata: topic => metadata[topic] };
client = new ClientKafka({});
assigner = new KafkaReplyPartitionAssigner(client, { cluster });
topics = ['topic-A', 'topic-B'];
getConsumerAssignments = sinon.spy(client, 'getConsumerAssignments');
getPreviousAssignment = sinon.spy(assigner, 'getPreviousAssignment');
// reset previous assignments
(client as any).consumerAssignments = {};
});
describe('assign', () => {
it('assign all partitions evenly', async () => {
metadata['topic-A'] = Array(14)
.fill(1)
.map((_, i) => ({ partitionId: i }));
metadata['topic-B'] = Array(5)
.fill(1)
.map((_, i) => ({ partitionId: i }));
const members = [
{
memberId: 'member-3',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {},
}),
),
}),
},
{
memberId: 'member-1',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {},
}),
),
}),
},
{
memberId: 'member-4',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {},
}),
),
}),
},
{
memberId: 'member-2',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {},
}),
),
}),
},
];
const assignment = await assigner.assign({ members, topics });
expect(assignment).to.deep.equal([
{
memberId: 'member-1',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [0, 4, 8, 12],
'topic-B': [0],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-2',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [1, 5, 9, 13],
'topic-B': [1],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-3',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [2, 6, 10],
'topic-B': [2, 4],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-4',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [3, 7, 11],
'topic-B': [3],
},
userData: Buffer.alloc(0),
}),
},
]);
});
});
describe('re-assign', () => {
it('assign all partitions evenly', async () => {
metadata['topic-A'] = Array(11)
.fill(1)
.map((_, i) => ({ partitionId: i }));
metadata['topic-B'] = Array(7)
.fill(1)
.map((_, i) => ({ partitionId: i }));
const members = [
{
memberId: 'member-3',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {
'topic-A': 0,
'topic-B': 0,
},
}),
),
}),
},
{
memberId: 'member-1',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {
'topic-A': 1,
'topic-B': 1,
},
}),
),
}),
},
{
memberId: 'member-4',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {
'topic-A': 2,
},
}),
),
}),
},
{
memberId: 'member-2',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
previousAssignment: {},
}),
),
}),
},
];
const assignment = await assigner.assign({ members, topics });
expect(assignment).to.deep.equal([
{
memberId: 'member-1',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [1, 4, 8],
'topic-B': [1, 5],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-2',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [3, 5, 9],
'topic-B': [2, 6],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-3',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [0, 6, 10],
'topic-B': [0],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-4',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [2, 7],
'topic-B': [3, 4],
},
userData: Buffer.alloc(0),
}),
},
]);
});
});
describe('protocol', () => {
it('returns the assigner name and metadata', () => {
// set previous assignments
(client as any).consumerAssignments = {
'topic-A': 0,
'topic-B': 1,
};
const protocol = assigner.protocol({ topics });
expect(getPreviousAssignment.calledOnce).to.be.true;
expect(getConsumerAssignments.calledOnce).to.be.true;
expect(protocol).to.deep.equal({
name: assigner.name,
metadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics,
userData: Buffer.from(
JSON.stringify({
previousAssignment: (client as any).consumerAssignments,
}),
),
}),
});
});
});
});


@@ -1,143 +0,0 @@
import { expect } from 'chai';
import * as Kafka from 'kafkajs';
import { KafkaRoundRobinPartitionAssigner } from '../../helpers/kafka-round-robin-partition-assigner';
describe('kafka round robin by time', () => {
let cluster, topics, metadata, assigner;
beforeEach(() => {
metadata = {};
cluster = { findTopicPartitionMetadata: topic => metadata[topic] };
assigner = new KafkaRoundRobinPartitionAssigner({ cluster });
topics = ['topic-A', 'topic-B'];
});
describe('assign', () => {
it('assign all partitions evenly', async () => {
metadata['topic-A'] = Array(14)
.fill(1)
.map((_, i) => ({ partitionId: i }));
metadata['topic-B'] = Array(5)
.fill(1)
.map((_, i) => ({ partitionId: i }));
const members = [
{
memberId: 'member-3',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
time: [0, 0], // process.hrtime()
}),
),
}),
},
{
memberId: 'member-1',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
time: [0, 1], // process.hrtime()
}),
),
}),
},
{
memberId: 'member-4',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
time: [1, 1], // process.hrtime()
}),
),
}),
},
{
memberId: 'member-2',
memberMetadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics: ['topic-A', 'topic-B'],
userData: Buffer.from(
JSON.stringify({
time: [2, 0], // process.hrtime()
}),
),
}),
},
];
const assignment = await assigner.assign({ members, topics });
expect(assignment).to.deep.equal([
{
memberId: 'member-3',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [0, 4, 8, 12],
'topic-B': [0, 4],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-1',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [1, 5, 9, 13],
'topic-B': [1],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-4',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [2, 6, 10],
'topic-B': [2],
},
userData: Buffer.alloc(0),
}),
},
{
memberId: 'member-2',
memberAssignment: Kafka.AssignerProtocol.MemberAssignment.encode({
version: assigner.version,
assignment: {
'topic-A': [3, 7, 11],
'topic-B': [3],
},
userData: Buffer.alloc(0),
}),
},
]);
});
});
describe('protocol', () => {
it('returns the assigner name and metadata', () => {
expect(assigner.protocol({ topics })).to.deep.equal({
name: assigner.name,
metadata: Kafka.AssignerProtocol.MemberMetadata.encode({
version: assigner.version,
topics,
userData: Buffer.from(
JSON.stringify({
time: assigner.getTime(),
}),
),
}),
});
});
});
});
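The expected assignments above follow a per-topic modulo pattern: with members ordered by their reported `process.hrtime()` join time, the i-th member takes every partition p where p % memberCount === i. A hypothetical sketch of that distribution (not the actual `KafkaRoundRobinPartitionAssigner` code):

```typescript
// Hypothetical per-topic round-robin: the i-th member (in join order)
// receives every partition p with p % orderedMembers.length === i.
function roundRobinAssign(
  partitionCounts: Record<string, number>,
  orderedMembers: string[],
): Record<string, Record<string, number[]>> {
  const result: Record<string, Record<string, number[]>> = {};
  for (const member of orderedMembers) {
    result[member] = {};
  }
  for (const [topic, count] of Object.entries(partitionCounts)) {
    for (let p = 0; p < count; p++) {
      const member = orderedMembers[p % orderedMembers.length];
      if (!result[member][topic]) {
        result[member][topic] = [];
      }
      result[member][topic].push(p);
    }
  }
  return result;
}
```

With 14 partitions of topic-A, 5 of topic-B, and members ordered member-3, member-1, member-4, member-2 (by time), member-3 receives topic-A [0, 4, 8, 12] and topic-B [0, 4], matching the expectations in the test above.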


@@ -21,7 +21,7 @@ describe('JsonSocket connection', () => {
new Promise(callback => {
clientSocket.sendMessage({ type: 'ping' }, callback);
}),
new Promise(callback => {
new Promise<void>(callback => {
clientSocket.on(MESSAGE_EVENT, (message: string) => {
expect(message).to.deep.equal({ type: 'pong' });
callback();
@@ -53,16 +53,16 @@ describe('JsonSocket connection', () => {
expect(clientSocket['isClosed']).to.equal(false);
expect(serverSocket['isClosed']).to.equal(false);
Promise.all([
new Promise(callback => {
new Promise<void>(callback => {
clientSocket.sendMessage(longPayload, callback);
}),
new Promise(callback => {
new Promise<void>(callback => {
clientSocket.on(MESSAGE_EVENT, (message: { type: 'pong' }) => {
expect(message).to.deep.equal({ type: 'pong' });
callback();
});
}),
new Promise(callback => {
new Promise<void>(callback => {
serverSocket.on(MESSAGE_EVENT, (message: { type: 'pong' }) => {
expect(message).to.deep.equal(longPayload);
serverSocket.sendMessage({ type: 'pong' }, callback);
@@ -85,7 +85,7 @@ describe('JsonSocket connection', () => {
return done(err);
}
Promise.all([
new Promise(callback =>
new Promise<void>(callback =>
Promise.all(
helpers
.range(1, 100)
@@ -97,7 +97,7 @@ describe('JsonSocket connection', () => {
),
).then(_ => callback()),
),
new Promise(callback => {
new Promise<void>(callback => {
let lastNumber = 0;
serverSocket.on(MESSAGE_EVENT, (message: { number: number }) => {
expect(message.number).to.deep.equal(lastNumber + 1);
@@ -128,7 +128,7 @@ describe('JsonSocket connection', () => {
})
.then(
() =>
new Promise(callback => {
new Promise<void>(callback => {
expect(clientSocket['isClosed']).to.equal(true);
expect(serverSocket['isClosed']).to.equal(true);
callback();
@@ -154,7 +154,7 @@ describe('JsonSocket connection', () => {
})
.then(
() =>
new Promise(callback => {
new Promise<void>(callback => {
expect(clientSocket['isClosed']).to.equal(true);
expect(serverSocket['isClosed']).to.equal(true);
callback();


@@ -135,8 +135,8 @@ describe('ServerKafka', () => {
(server as any).consumer = consumer;
(server as any).producer = producer;
});
it('should close server', () => {
server.close();
it('should close server', async () => {
await server.close();
expect(consumer.disconnect.calledOnce).to.be.true;
expect(producer.disconnect.calledOnce).to.be.true;
@@ -229,7 +229,9 @@ describe('ServerKafka', () => {
replyPartition,
correlationId,
);
sendMessageStub = sinon.stub(server, 'sendMessage').callsFake(() => ({}));
sendMessageStub = sinon
.stub(server, 'sendMessage')
.callsFake(async () => []);
});
it(`should return function`, () => {
expect(typeof server.getPublisher(null, null, correlationId)).to.be.eql(
@@ -258,7 +260,7 @@ describe('ServerKafka', () => {
let getPublisherSpy: sinon.SinonSpy;
beforeEach(() => {
sinon.stub(server, 'sendMessage').callsFake(() => ({}));
sinon.stub(server, 'sendMessage').callsFake(async () => []);
getPublisherSpy = sinon.spy();
sinon.stub(server, 'getPublisher').callsFake(() => getPublisherSpy);


@@ -39,7 +39,7 @@ export function AnyFilesInterceptor(
): Promise<Observable<any>> {
const ctx = context.switchToHttp();
await new Promise((resolve, reject) =>
await new Promise<void>((resolve, reject) =>
this.multer.any()(ctx.getRequest(), ctx.getResponse(), (err: any) => {
if (err) {
const error = transformException(err);


@@ -43,7 +43,7 @@ export function FileFieldsInterceptor(
): Promise<Observable<any>> {
const ctx = context.switchToHttp();
await new Promise((resolve, reject) =>
await new Promise<void>((resolve, reject) =>
this.multer.fields(uploadFields)(
ctx.getRequest(),
ctx.getResponse(),


@@ -40,7 +40,7 @@ export function FileInterceptor(
): Promise<Observable<any>> {
const ctx = context.switchToHttp();
await new Promise((resolve, reject) =>
await new Promise<void>((resolve, reject) =>
this.multer.single(fieldName)(
ctx.getRequest(),
ctx.getResponse(),


@@ -41,7 +41,7 @@ export function FilesInterceptor(
): Promise<Observable<any>> {
const ctx = context.switchToHttp();
await new Promise((resolve, reject) =>
await new Promise<void>((resolve, reject) =>
this.multer.array(fieldName, maxCount)(
ctx.getRequest(),
ctx.getResponse(),


@@ -1,6 +1,6 @@
{
"name": "@nestjs/platform-express",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@platform-express)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -24,8 +24,8 @@
"tslib": "2.0.3"
},
"devDependencies": {
"@nestjs/common": "7.5.5",
"@nestjs/core": "7.5.5"
"@nestjs/common": "7.6.3",
"@nestjs/core": "7.6.3"
},
"peerDependencies": {
"@nestjs/common": "^7.0.0",


@@ -1,6 +1,6 @@
{
"name": "@nestjs/platform-fastify",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@platform-fastify)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -17,10 +17,10 @@
"access": "public"
},
"dependencies": {
"fastify": "3.9.1",
"fastify-cors": "5.0.0",
"fastify": "3.9.2",
"fastify-cors": "5.1.0",
"fastify-formbody": "5.0.0",
"light-my-request": "4.3.0",
"light-my-request": "4.4.1",
"middie": "5.2.0",
"path-to-regexp": "3.2.0",
"tslib": "2.0.3"


@@ -1,6 +1,6 @@
{
"name": "@nestjs/platform-socket.io",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@platform-socket.io)",
"author": "Kamil Mysliwiec",
"license": "MIT",


@@ -1,6 +1,6 @@
{
"name": "@nestjs/platform-ws",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@platform-ws)",
"author": "Kamil Mysliwiec",
"license": "MIT",


@@ -1,6 +1,6 @@
{
"name": "@nestjs/testing",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@testing)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -22,6 +22,16 @@
},
"peerDependencies": {
"@nestjs/common": "^7.0.0",
"@nestjs/core": "^7.0.0"
"@nestjs/core": "^7.0.0",
"@nestjs/microservices": "^7.0.0",
"@nestjs/platform-express": "^7.0.0"
},
"peerDependenciesMeta": {
"@nestjs/microservices": {
"optional": true
},
"@nestjs/platform-express": {
"optional": true
}
}
}


@@ -1,6 +1,6 @@
{
"name": "@nestjs/websockets",
"version": "7.5.5",
"version": "7.6.3",
"description": "Nest - modern, fast, powerful node.js web framework (@websockets)",
"author": "Kamil Mysliwiec",
"license": "MIT",
@@ -16,12 +16,13 @@
"tslib": "2.0.3"
},
"devDependencies": {
"@nestjs/common": "7.5.5",
"@nestjs/core": "7.5.5"
"@nestjs/common": "7.6.3",
"@nestjs/core": "7.6.3"
},
"peerDependencies": {
"@nestjs/common": "^7.0.0",
"@nestjs/core": "^7.0.0",
"reflect-metadata": "^0.1.12",
"rxjs": "^6.0.0"
}
}

File diff suppressed because it is too large.


@@ -19,9 +19,9 @@
"test:e2e": "jest --config ./e2e/jest-e2e.json"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"reflect-metadata": "0.1.13",
@@ -29,26 +29,26 @@
"rxjs": "6.6.3"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/jest": "26.0.15",
"@types/node": "10.17.3",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/jest": "26.0.19",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
},
"jest": {
"moduleFileExtensions": [

File diff suppressed because it is too large.


@@ -19,11 +19,11 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/platform-socket.io": "7.4.4",
"@nestjs/websockets": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"@nestjs/platform-socket.io": "7.6.2",
"@nestjs/websockets": "7.6.2",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"reflect-metadata": "0.1.13",
@@ -32,27 +32,27 @@
"socket.io-redis": "5.4.0"
},
"devDependencies": {
"@types/socket.io": "2.1.11",
"@types/socket.io": "2.1.12",
"@types/socket.io-redis": "1.0.26",
"@types/ws": "7.2.9",
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "7.10.9",
"@types/ws": "7.4.0",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,10 +19,10 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/microservices": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/microservices": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"reflect-metadata": "0.1.13",
@@ -30,25 +30,25 @@
"rxjs": "6.6.3"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/amqplib": "0.5.16",
"@types/express": "4.17.8",
"@types/node": "12.12.31",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -20,36 +20,36 @@
},
"dependencies": {
"@grpc/proto-loader": "0.5.5",
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/microservices": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/microservices": "7.6.2",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"grpc": "1.24.3",
"grpc": "1.24.4",
"reflect-metadata": "0.1.13",
"rimraf": "3.0.2",
"rxjs": "6.6.3"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "10.17.3",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"@types/ws": "7.2.9",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"@types/ws": "7.4.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,36 +19,36 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/typeorm": "7.1.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"@nestjs/typeorm": "7.1.5",
"mysql": "2.18.1",
"reflect-metadata": "0.1.13",
"rimraf": "3.0.2",
"rxjs": "6.6.3",
"typeorm": "0.2.28"
"typeorm": "0.2.29"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "7.10.9",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"@types/ws": "7.2.9",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"@types/ws": "7.4.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,36 +19,36 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/mongoose": "7.0.2",
"@nestjs/platform-express": "7.4.4",
"mongoose": "5.10.10",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/mongoose": "7.2.0",
"@nestjs/platform-express": "7.6.2",
"mongoose": "5.11.8",
"reflect-metadata": "0.1.13",
"rimraf": "3.0.2",
"rxjs": "6.6.3"
},
"devDependencies": {
"@types/mongoose": "5.7.36",
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "12.12.31",
"@types/mongoose": "5.10.3",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"@types/ws": "7.2.9",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"@types/ws": "7.4.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,39 +19,39 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"@nestjs/sequelize": "0.1.1",
"mysql2": "2.2.5",
"reflect-metadata": "0.1.13",
"rimraf": "3.0.2",
"rxjs": "6.6.3",
"sequelize": "5.22.3",
"sequelize": "6.3.5",
"sequelize-typescript": "1.1.0",
"typescript": "4.0.3"
"typescript": "4.1.3"
},
"devDependencies": {
"@types/sequelize": "4.28.9",
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "12.12.31",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"@types/ws": "7.2.9",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"@types/ws": "7.4.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -11,26 +11,26 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.5.1",
"@nestjs/core": "7.5.1",
"@nestjs/platform-express": "7.5.1",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"reflect-metadata": "0.1.13",
"rxjs": "6.6.3",
"typescript": "4.0.5"
"typescript": "4.1.3"
},
"devDependencies": {
"@nestjs/cli": "7.5.2",
"@nestjs/schematics": "7.2.1",
"@types/node": "14.14.7",
"@typescript-eslint/eslint-plugin": "4.6.1",
"@typescript-eslint/parser": "4.6.1",
"eslint": "7.12.1",
"eslint-config-prettier": "6.15.0",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@types/node": "14.14.14",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"start-server-webpack-plugin": "2.2.5",
"ts-loader": "8.0.9",
"ts-node": "9.0.0",
"webpack": "5.4.0",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"webpack": "5.10.3",
"webpack-cli": "4.2.0",
"webpack-node-externals": "2.5.2"
}

File diff suppressed because it is too large.


@@ -13,28 +13,28 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/microservices": "7.4.4",
"@nestjs/websockets": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"@nestjs/microservices": "7.6.2",
"@nestjs/websockets": "7.6.2",
"reflect-metadata": "0.1.13",
"rxjs": "6.6.3"
},
"devDependencies": {
"@babel/cli": "7.12.8",
"@babel/core": "7.12.9",
"@babel/node": "7.12.6",
"@babel/cli": "7.12.10",
"@babel/core": "7.12.10",
"@babel/node": "7.12.10",
"@babel/plugin-proposal-decorators": "7.12.1",
"@babel/plugin-transform-runtime": "7.12.1",
"@babel/preset-env": "7.12.7",
"@babel/register": "7.12.1",
"@babel/plugin-transform-runtime": "7.12.10",
"@babel/preset-env": "7.12.11",
"@babel/register": "7.12.10",
"@babel/runtime": "7.12.5",
"@nestjs/testing": "7.4.4",
"jest": "26.6.2",
"@nestjs/testing": "7.6.2",
"jest": "26.6.3",
"nodemon": "2.0.6",
"prettier": "2.1.2",
"supertest": "5.0.0"
"prettier": "2.2.1",
"supertest": "6.0.1"
},
"jest": {
"moduleFileExtensions": [

File diff suppressed because it is too large.


@@ -19,9 +19,9 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-fastify": "7.4.4",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-fastify": "7.6.2",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"reflect-metadata": "0.1.13",
@@ -29,25 +29,25 @@
"rxjs": "6.6.3"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "12.12.31",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"@types/ws": "7.2.9",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"@types/ws": "7.4.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,36 +19,36 @@
"test:e2e": "echo 'No e2e tests implemented yet.'"
},
"dependencies": {
"@nestjs/common": "7.4.4",
"@nestjs/core": "7.4.4",
"@nestjs/platform-express": "7.4.4",
"@nestjs/swagger": "4.6.1",
"@nestjs/common": "7.6.2",
"@nestjs/core": "7.6.2",
"@nestjs/platform-express": "7.6.2",
"@nestjs/swagger": "4.7.7",
"class-transformer": "0.3.1",
"class-validator": "0.12.2",
"reflect-metadata": "0.1.13",
"rimraf": "3.0.2",
"rxjs": "6.6.3",
"swagger-ui-express": "4.1.4"
"swagger-ui-express": "4.1.5"
},
"devDependencies": {
"@nestjs/cli": "7.5.1",
"@nestjs/schematics": "7.2.1",
"@nestjs/testing": "7.4.4",
"@types/express": "4.17.8",
"@types/node": "10.17.3",
"@nestjs/cli": "7.5.4",
"@nestjs/schematics": "7.2.5",
"@nestjs/testing": "7.6.2",
"@types/express": "4.17.9",
"@types/node": "14.14.14",
"@types/supertest": "2.0.10",
"jest": "26.6.2",
"prettier": "2.1.2",
"supertest": "5.0.0",
"ts-jest": "26.4.1",
"ts-loader": "8.0.7",
"ts-node": "9.0.0",
"jest": "26.6.3",
"prettier": "2.2.1",
"supertest": "6.0.1",
"ts-jest": "26.4.4",
"ts-loader": "8.0.12",
"ts-node": "9.1.1",
"tsconfig-paths": "3.9.0",
"@typescript-eslint/eslint-plugin": "4.5.0",
"@typescript-eslint/parser": "4.5.0",
"eslint": "7.10.0",
"eslint-config-prettier": "6.11.0",
"@typescript-eslint/eslint-plugin": "4.10.0",
"@typescript-eslint/parser": "4.10.0",
"eslint": "7.15.0",
"eslint-config-prettier": "7.0.0",
"eslint-plugin-import": "2.22.1",
"typescript": "4.0.3"
"typescript": "4.1.3"
}
}

File diff suppressed because it is too large.


@@ -19,39 +19,39 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/graphql": "7.7.0",
-    "@nestjs/platform-express": "7.4.4",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/graphql": "7.9.1",
+    "@nestjs/platform-express": "7.6.2",
     "apollo-server": "2.19.0",
     "apollo-server-express": "2.19.0",
     "class-transformer": "0.3.1",
     "class-validator": "0.12.2",
-    "graphql": "15.3.0",
+    "graphql": "15.4.0",
     "graphql-subscriptions": "1.1.0",
     "reflect-metadata": "0.1.13",
     "rimraf": "3.0.2",
     "rxjs": "6.6.3"
   },
   "devDependencies": {
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "12.12.31",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large


@@ -19,35 +19,35 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/platform-express": "7.4.4",
-    "@nestjs/typeorm": "7.1.4",
-    "mongodb": "3.6.2",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/platform-express": "7.6.2",
+    "@nestjs/typeorm": "7.1.5",
+    "mongodb": "3.6.3",
     "reflect-metadata": "0.1.13",
     "rimraf": "3.0.2",
     "rxjs": "6.6.3",
-    "typeorm": "0.2.28"
+    "typeorm": "0.2.29"
   },
   "devDependencies": {
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "12.12.31",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large


@@ -19,33 +19,33 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/platform-express": "7.4.4",
-    "mongoose": "5.10.10",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/platform-express": "7.6.2",
+    "mongoose": "5.11.8",
     "reflect-metadata": "0.1.13",
     "rimraf": "3.0.2",
     "rxjs": "6.6.3"
   },
   "devDependencies": {
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "7.10.9",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }
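Notice that these diffs also converge @types/node on 14.14.14 across every sample (the previous pins ranged from 7.10.9 up to 12.12.31), in line with the renovate/node-14.x branch merged in this commit list. Keeping the @types/node major aligned with the Node runtime major is the usual convention; a small illustrative helper (hypothetical, not part of the repo) for checking that alignment:

```python
def majors_match(node_version: str, types_pin: str) -> bool:
    """True when the Node runtime major equals the @types/node pin major."""
    def major(v: str) -> int:
        # Accept both "v14.15.3" (node --version output) and "14.14.14".
        return int(v.lstrip("v").split(".")[0])
    return major(node_version) == major(types_pin)

print(majors_match("v14.15.3", "14.14.14"))  # True: both are major 14
print(majors_match("v12.20.0", "14.14.14"))  # False: runtime lags the types
```

The earlier pins like "@types/node": "7.10.9" would fail this check against a Node 12 or 14 runtime, which is presumably why the bump is applied uniformly.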

File diff suppressed because it is too large


@@ -19,9 +19,9 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/platform-express": "7.4.4",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/platform-express": "7.6.2",
     "hbs": "4.1.1",
     "pug": "3.0.0",
     "reflect-metadata": "0.1.13",
@@ -29,24 +29,24 @@
     "rxjs": "6.6.3"
   },
   "devDependencies": {
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "8.10.58",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large


@@ -19,38 +19,38 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/platform-express": "7.4.4",
-    "@nestjs/platform-ws": "7.4.4",
-    "@nestjs/websockets": "7.4.4",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/platform-express": "7.6.2",
+    "@nestjs/platform-ws": "7.6.2",
+    "@nestjs/websockets": "7.6.2",
     "class-transformer": "0.3.1",
     "class-validator": "0.12.2",
     "rimraf": "3.0.2",
     "reflect-metadata": "0.1.13",
     "rxjs": "6.6.3",
-    "ws": "7.3.1"
+    "ws": "7.4.1"
   },
   "devDependencies": {
-    "@types/ws": "7.2.9",
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "12.12.31",
+    "@types/ws": "7.4.0",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large


@@ -19,38 +19,38 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
-    "@nestjs/platform-fastify": "7.4.4",
-    "fastify-static": "3.2.1",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
+    "@nestjs/platform-fastify": "7.6.2",
+    "fastify-static": "3.3.0",
     "handlebars": "4.7.6",
-    "point-of-view": "4.6.0",
+    "point-of-view": "4.7.0",
     "reflect-metadata": "0.1.13",
     "rimraf": "3.0.2",
     "rxjs": "6.6.3"
   },
   "devDependencies": {
-    "@types/socket.io": "2.1.11",
+    "@types/socket.io": "2.1.12",
     "@types/socket.io-redis": "1.0.26",
-    "@types/ws": "7.2.9",
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@nestjs/testing": "7.4.4",
-    "@types/express": "4.17.8",
-    "@types/node": "8.10.58",
+    "@types/ws": "7.4.0",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@nestjs/testing": "7.6.2",
+    "@types/express": "4.17.9",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large


@@ -19,29 +19,29 @@
     "test:e2e": "echo 'No e2e tests implemented yet.'"
   },
   "dependencies": {
-    "@nestjs/common": "7.4.4",
-    "@nestjs/core": "7.4.4",
+    "@nestjs/common": "7.6.2",
+    "@nestjs/core": "7.6.2",
     "reflect-metadata": "0.1.13",
     "rimraf": "3.0.2",
     "rxjs": "6.6.3"
   },
   "devDependencies": {
-    "@nestjs/cli": "7.5.1",
-    "@nestjs/schematics": "7.2.1",
-    "@types/node": "12.12.31",
+    "@nestjs/cli": "7.5.4",
+    "@nestjs/schematics": "7.2.5",
+    "@types/node": "14.14.14",
     "@types/supertest": "2.0.10",
-    "jest": "26.6.2",
-    "prettier": "2.1.2",
-    "supertest": "5.0.0",
-    "ts-jest": "26.4.1",
-    "ts-loader": "8.0.7",
-    "ts-node": "9.0.0",
+    "jest": "26.6.3",
+    "prettier": "2.2.1",
+    "supertest": "6.0.1",
+    "ts-jest": "26.4.4",
+    "ts-loader": "8.0.12",
+    "ts-node": "9.1.1",
     "tsconfig-paths": "3.9.0",
-    "@typescript-eslint/eslint-plugin": "4.5.0",
-    "@typescript-eslint/parser": "4.5.0",
-    "eslint": "7.10.0",
-    "eslint-config-prettier": "6.11.0",
+    "@typescript-eslint/eslint-plugin": "4.10.0",
+    "@typescript-eslint/parser": "4.10.0",
+    "eslint": "7.15.0",
+    "eslint-config-prettier": "7.0.0",
     "eslint-plugin-import": "2.22.1",
-    "typescript": "4.0.3"
+    "typescript": "4.1.3"
   }
 }

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff