cofacts

Month: 2020-07

2020-07-01

github 01:41:36

Comment on #202 Convert GraphQL userArticleLinks to cursor-based pagination

"Rebase and merge" doesn't seem to leave a merge commit 囧 I'll just use the most traditional merge; it's easier to trace a PR from its commits.

github 01:56:59

Comment on #201 Feature/163 insert data

<https://github.com/godgunman|@godgunman> did you finish the changes but forget to push?

github 01:59:13

#206 Add google analytics to LIFF

• Google Analytics setup for LIFF
• Sends pageview on page change
• Sends user timing on LIFF load
• Records redirect count as a redirect event
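A minimal sketch of the kind of hits described above (assuming gtag.js; the actual implementation in this PR may use analytics.js and different event names):

```js
/* global gtag */
// Assumes the gtag.js snippet is already loaded on the LIFF page.

export function sendPageView(pagePath) {
  // Report a pageview whenever the LIFF page changes
  gtag('config', 'GA_MEASUREMENT_ID', { page_path: pagePath });
}

export function sendLiffLoadTiming(msSinceStart) {
  // User timing for how long the LIFF initialization took
  gtag('event', 'timing_complete', {
    name: 'liff_load',
    value: Math.round(msSinceStart),
    event_category: 'LIFF',
  });
}

export function sendRedirectEvent(redirectCount) {
  // Record how many redirects happened before landing on the page
  gtag('event', 'redirect', { event_category: 'LIFF', value: redirectCount });
}
```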

github 02:14:49

Comment on #206 Add google analytics to LIFF

*Pull Request Test Coverage Report for <https://coveralls.io/builds/31779780|Build 978>* • *0* of *0* changed or added relevant lines in *0* files are covered. • No unchanged relevant lines lost coverage. • Overall coverage remained the same at *98.379%* :yellow_heart: - <https://coveralls.io|Coveralls>

github 05:25:58

Comment on #270 Hotfix/minor bugs

The user id is also used to identify whether the current user added the category (so that they can remove categories they added themselves; see line 298), so `disableVote` may be a bit confusing. It's good to extract the author-check logic to the upper component, though.

github 05:31:03

Comment on #270 Hotfix/minor bugs

As for the mutation part, it's certainly difficult to put it in an appropriate place. `useMutation` is a hook and it uses a closure to get the required params; this means we should put the params and `useMutation` in the same function, while keeping the calling order consistent so that it won't break the rules of hooks.
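A minimal sketch of that constraint (assuming Apollo's `useMutation`; the component and mutation below are hypothetical):

```js
import React from 'react';
import { useMutation } from '@apollo/react-hooks';
import gql from 'graphql-tag';

// Hypothetical mutation for illustration only
const DELETE_CATEGORY = gql`
  mutation DeleteCategory($articleId: String!, $categoryId: String!) {
    DeleteArticleCategory(articleId: $articleId, categoryId: $categoryId) {
      articleId
    }
  }
`;

function CategoryChip({ articleId, categoryId }) {
  // The hook and the params it closes over live in the same component, and the
  // hook is called unconditionally so the call order never changes across renders.
  const [deleteCategory, { loading }] = useMutation(DELETE_CATEGORY, {
    variables: { articleId, categoryId },
  });

  return (
    <button disabled={loading} onClick={() => deleteCategory()}>
      Remove
    </button>
  );
}

export default CategoryChip;
```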

mrorz 10:26:44
Today's meeting notes

Recent meetings will mainly track wrap-up progress, and release things that are already done and tested as much as possible.
There are also new things to review.
https://g0v.hackmd.io/@mrorz/cofacts-meeting-notes/%2FU-KmeEYJTVmtCFB-bqWfGg
github 13:39:37

Comment on #269 Feature/reply editor toobar

1. `replyReference` does not look like a variable that is really needed. If we do need this variable, please use `const` and replace `reply.text + '\n'` with `replyReference` instead.
2. Please remove the unused comment.
3. It seems that we did not copy the reply reference to `ReferenceInput` yet?

github 13:39:37

Review on #269 Feature/reply editor toobar

The unit test looks clearer now :) Thanks! I think we are good to go after we implement reference copying on "Add this reply to my reply" button.

github 14:05:21

Review on #270 Hotfix/minor bugs

LGTM! I am seeing this warning when running `npm run dev`, not sure if it will cause errors:
```
Warning: fragment with name ArticleWithCategories already exists.
graphql-tag enforces all fragment names across your application to be unique; read more about this in the docs: http://dev.apollodata.com/core/fragments.html#unique-names
```
Let's see it on staging.

github 14:26:03

#271 [Trivial] Fix trendline

*Before* <https://user-images.githubusercontent.com/108608/86210185-babaf000-bba6-11ea-9111-6c086ffce5b0.png|image> *After* <https://user-images.githubusercontent.com/108608/86210121-a37c0280-bba6-11ea-9a82-471e8e69d367.png|image>

github 15:30:21

Comment on #269 Feature/reply editor toobar

Thanks for adding `reference`. I think we should use `reply.reference` directly instead of composing it from `reply.hyperlinks`. Sometimes the author of the reply may <https://old.cofacts.org/reply/ORLVmHEBrhVJn3LNJnmn|put some info in the reply's reference field>, such as adding section titles between hyperlinks, or providing descriptions for URLs that are more concise than the hyperlink titles.

mrorz 15:52:20
Things to handle tonight:

- Track the progress of wrapping up open items
- Go through the trend chart spec
- LINE bot release test
- Website release retest (last week's blocker should already be resolved)
- LIFF API change discussion
github 18:02:44

Review on #269 Feature/reply editor toobar

LGTM! Thanks for the fix, let's see it in staging :sunglasses:

github 19:03:55

#207 Feature/165 cron notify

<https://github.com/cofacts/rumors-line-bot/issues/165|#165>

github 19:06:55

Comment on #207 Feature/165 cron notify

*Pull Request Test Coverage Report for <https://coveralls.io/builds/31796635|Build 984>* • *26* of *44* *(59.09%)* changed or added relevant lines in *4* files are covered. • No unchanged relevant lines lost coverage. • Overall coverage decreased (*-4.7%*) to *93.679%* :yellow_heart: - <https://coveralls.io|Coveralls>

2020-07-02

mrorz 08:54:44
https://www.facebook.com/groups/linebot/permalink/2549701032027134/ LIFF can now be loaded via npm, and it also comes with TypeScript definitions

facebook.com

Richard Lee

The LIFF SDK can now also be installed via NPM. With the NPM-installed SDK, the development environment also gets types and autocompletion support, so writing code with it is super efficient!
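A minimal sketch of loading LIFF from npm (assuming the `@line/liff` package; the LIFF ID is a placeholder):

```js
import liff from '@line/liff';

async function initLiff() {
  // The LIFF ID below is a placeholder; use the one from the LINE Developers console
  await liff.init({ liffId: '1234567890-abcdefgh' });

  if (!liff.isLoggedIn()) {
    liff.login();
    return;
  }

  const profile = await liff.getProfile();
  console.log(profile.displayName);
}

initLiff();
```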

2020-07-03

mrorz 01:39:12
Deployed to production!
github 13:48:20

Comment on #207 Feature/165 cron notify

If we put the cron job and the web server together, there may be a case where, with multiple instances of the web server, the cron job gets triggered once per instance, causing the job to run multiple times. I would suggest we just prepare a standalone NodeJS script that can be invoked via the command line. On other environments, people can use the native crontab to trigger the script; and on Heroku, we can use <https://devcenter.heroku.com/articles/scheduler|Heroku scheduler> to schedule a run.

github 13:48:20

Comment on #207 Feature/165 cron notify

Since we already use `date-fns` in our application, I think we should use <https://date-fns.org/v2.14.0/docs/add|`date-fns/add`> directly, instead of implementing our own function.
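For reference, a minimal sketch of using `date-fns/add` (the duration values here are made up):

```js
import add from 'date-fns/add';

// "now + 1 day and 2 hours", without hand-rolling the date math
const notifyAt = add(new Date(), { days: 1, hours: 2 });
console.log(notifyAt.toISOString());
```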

github 13:48:20

Review on #207 Feature/165 cron notify

I haven't reviewed the core logic that determines whom to notify; this is just to provide some thoughts on how the cron job can be defined and how users can be notified. I love the test cases, they helped me a lot in understanding how each utility function works :nerd_face:

github 13:48:20

Comment on #207 Feature/165 cron notify

I wonder if `push` is used anywhere yet?

github 13:48:20

Comment on #207 Feature/165 cron notify

I would suggest we send a flex message with a button when doing multicast, so that we can hide the complex LIFF URLs from users behind a button with a URI action.
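A minimal sketch of such a flex message (the LIFF URL and wording are placeholders, not the actual ones in this PR):

```js
// Placeholder LIFF URL; the real one would carry query params pointing to the article
const LIFF_URL = 'https://liff.line.me/1234567890-abcdefgh?p=articles';

// This object would go into the `messages` array of the multicast request
const flexMessage = {
  type: 'flex',
  altText: 'An article you asked about has a new reply!',
  contents: {
    type: 'bubble',
    body: {
      type: 'box',
      layout: 'vertical',
      contents: [
        {
          type: 'text',
          text: 'An article you asked about has a new reply.',
          wrap: true,
        },
      ],
    },
    footer: {
      type: 'box',
      layout: 'vertical',
      contents: [
        {
          type: 'button',
          style: 'primary',
          // The URI action hides the long LIFF URL behind a simple button
          action: { type: 'uri', label: 'View the reply', uri: LIFF_URL },
        },
      ],
    },
  },
};

export default flexMessage;
```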

github 13:48:20

Comment on #207 Feature/165 cron notify

If there are multiple instances running the web server, this may cause the cron job to be run once for each instance / process. I would suggest we just implement a NodeJS script that can be run from the command line. On an ordinary production environment, the DevOps can use a normal crontab to trigger the job; on our production environment on Heroku, we can use <https://devcenter.heroku.com/articles/scheduler|Heroku scheduler>. In this case, we won't need the `CronJob` library; we just need to make sure the script works when run on the CLI, just like the <https://github.com/cofacts/rumors-api/blob/master/src/scripts/cleanupUrls.js#L136-L138|hyperlink cleaning job> in rumors-api.
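A minimal sketch of such a CLI-runnable script, following the same pattern as the linked cleanup script (the file path and `notifyAll` helper are hypothetical):

```js
// src/scripts/notifyNewReplies.js (hypothetical path)
import { notifyAll } from '../lib/notify'; // hypothetical helper that does the actual work

export default async function main() {
  await notifyAll();
}

// Only runs when invoked directly (e.g. from crontab or Heroku scheduler);
// the module can still be imported in tests.
if (require.main === module) {
  main().catch(e => {
    console.error(e);
    process.exit(1);
  });
}
```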

github 13:51:42

Comment on #207 Feature/165 cron notify

no, currently we just use `multicast` and `notify`.

github 23:53:58

#272 hotfix: add category bug

Add user id field to CreateArticleCategory

2020-07-04

mrorz 16:13:32
By the way, while looking at @ggm's https://github.com/cofacts/rumors-line-bot/pull/201/files and @acerxp511's https://github.com/cofacts/rumors-line-bot/pull/207/files I noticed:
• PR 207's cron job assumes that every `UserArticleLink` has `lastViewedAt`
• PR 201 creates `UserArticleLink` in two ways: when a brand-new article is created, the `UserArticleLink` has no `lastViewedAt`; but when an existing article is found, a `UserArticleLink` is created or updated with `lastViewedAt` written, regardless of whether that article has replies
What I want to discuss is: should we require every `UserArticleLink` to have a `lastViewedAt` field? (Related spec: https://g0v.hackmd.io/eIeU2g86Tfu5VnLazNfUvQ)

Note: in any case, `UserArticleLink.createdAt` always exists and is never modified once written, so it is out of scope here.

g0v.hackmd.io

rumors-line-bot "messages sent in the past" implementation - HackMD

github 16:36:23

Comment on #201 Feature/163 insert data

Why don't we just use the mongo client for a find query? (Also for L118.) The main point of my previous comment was that `UserSetting.findOrInsertByUserId` will mutate the database, making the test pass even when the function under test (`singleUserHandler`) does not work as expected. It did not mean we should avoid using `mongoClient` entirely ._.

github 16:36:23

Review on #201 Feature/163 insert data

Thanks for updating the PR accordingly! Made a few suggestions and added the conclusion from the discussion <https://g0v-tw.slack.com/archives/C2PPMRQGP/p1593850412237300|on Slack> today.

github 16:36:23

Comment on #201 Feature/163 insert data

There is already a static method called `find` in `UserArticleLink`, which is used by the GraphQL resolver for `Query.userArticleLinks`. `find` was added before this branch was branched out, and it is now in the file after your rebase. I think we should leave only one endpoint that lists a user's user-article links. I like the naming `findByUserId` because it is clearer (and considering that we also need to find by article id in <https://github.com/cofacts/rumors-line-bot/pull/207|#207>). But the pagination mechanism is also required by the GraphQL API. Would you merge the implementations of the two find methods into `findByUserId()`, remove `find()`, and update the resolver (`graphql/resolvers/Query.js`) accordingly?

github 16:36:23

Comment on #201 Feature/163 insert data

Since `choosingReply` does not involve `UserArticleLink`, I think we should not include these in the test file, or it may confuse readers into thinking that `choosingReply` uses `UserArticleLink`.

github 16:36:23

Comment on #201 Feature/163 insert data

As discussed in slack today, we should use an API that also writes `lastViewedAt`. Also, please add `lastViewedAt` as a required field in `userArticleLink.json` to ensure that `lastViewedAt` always exists in DB.

github 16:36:23

Comment on #201 Feature/163 insert data

Actually there is another method called `findOrInsertByUserIdAndArticleId` in the `UserArticleLink` model that does almost the same thing as `updateTimestamps`, but it's not used anywhere. I suggest we leave only one such method in `UserArticleLink`.

github 18:32:07

Comment on #207 Feature/165 cron notify

1. Accepting multiple `articleIds` and searching by `{articleId: {$in: articleIds}}` can provide better flexibility. It should work more efficiently as well (see the sketch below).
2. Since this API still returns `UserArticleLink`s instead of users, the `UserList` in its name is a bit confusing. I suggest something like `findByArticleIds`.
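A minimal sketch of the suggested batch query (written against the raw mongodb driver here; in the repo it would live on the `UserArticleLink` model and reuse its existing connection handling, and the collection name is an assumption):

```js
import { MongoClient } from 'mongodb';

// Standalone sketch: in the repo this would be a static method on UserArticleLink.
async function findByArticleIds(articleIds) {
  const client = await MongoClient.connect(process.env.MONGODB_URI, {
    useUnifiedTopology: true,
  });
  try {
    // One $in query for the whole batch instead of one query per article id
    return await client
      .db()
      .collection('userArticleLink')
      .find({ articleId: { $in: articleIds } })
      .toArray();
  } finally {
    await client.close();
  }
}

export default findByArticleIds;
```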

github 18:32:07

Comment on #207 Feature/165 cron notify

According to <https://g0v-tw.slack.com/archives/C2PPMRQGP/p1593850412237300|our discussion on slack>, all `userArticleLink` should have `lastViewedAt`. We may need to update these fixtures to match that.

github 18:32:07

Comment on #207 Feature/165 cron notify

I suggest we move some of the `articleReply`'s `createdAt` outside of the queried time range because
• `ListArticle(repliedAt)` includes an article as long as it has _one_ `articleReply` within the time range, so it is possible for its results to contain _some_ article-replies outside the time range
• Including such a possibility in the test case can better cover all scenarios
• Currently there are already plenty of `articleReplies` in the fixture, but all of them are in the same time range. Moving some of them outside the time range can increase the diversity of fixture combinations.

github 18:32:07

Review on #207 Feature/165 cron notify

I have reviewed the logic of the cron jobs. Thanks for the contribution! As our <https://datastudio.google.com/u/0/reporting/18J8jZYumsoaCPBk9bdRd97GKvi_W5v-r/page/ckUQ|open data analytics> shows, sometimes more than 100 articles get replied to in a day. I have made some suggestions to enhance the robustness of the cron jobs by processing in batches. There are also some suggestions about the diversity of the test fixtures.

github 18:32:07

Comment on #207 Feature/165 cron notify

The combination of `Promise.all` and `....map()` will cause `UserSettings.findOrInsertByUserId` to be invoked in parallel. If there are `N` users to notify, this will trigger `N` queries to the database _at once_. This can be dangerous when a very popular article (>100 user article links) receives a new reply. I would suggest (see the sketch below):
• Divide `Object.keys(notificationList)` (user ids) into batches, with the size of each batch controlled by a constant.
• Use a for-loop with `await` inside the loop to make sure only 1 batch is processed at a time.
• Create a new static method on `UserSettings` that reads user settings in batch and does not perform an upsert.
 • Use `findAll({userId: {$in: userIds}})` in the new method to fetch a batch of user settings
 • There is no point inserting `UserSettings` in cron jobs
 • Skip a user when the user does not have `UserSettings` (it should never happen though, just in case)
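A minimal sketch of the batching described above (`BATCH_SIZE`, `findAllByUserIds`, the import path, and `sendNotification` are hypothetical names):

```js
import UserSettings from '../database/models/userSettings'; // hypothetical import path

const BATCH_SIZE = 25; // hypothetical constant controlling batch size

async function notifyUsersInBatches(notificationList, sendNotification) {
  const userIds = Object.keys(notificationList);

  // Process one batch at a time; awaiting inside the loop keeps us from firing
  // hundreds of DB queries or multicasts at once.
  for (let i = 0; i < userIds.length; i += BATCH_SIZE) {
    const batch = userIds.slice(i, i + BATCH_SIZE);

    // Hypothetical read-only batch method, e.g. findAll({userId: {$in: batch}}) underneath;
    // it performs no upsert.
    const settings = await UserSettings.findAllByUserIds(batch);
    const settingsByUserId = new Map(settings.map(s => [s.userId, s]));

    for (const userId of batch) {
      const setting = settingsByUserId.get(userId);
      // Skip users without settings or who turned off new-reply notifications
      if (!setting || !setting.allowNewReplyUpdate) continue;

      await sendNotification(userId, notificationList[userId]);
    }
  }
}

export default notifyUsersInBatches;
```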

github 18:32:07

Comment on #207 Feature/165 cron notify

Sending one notification event for each user that receives a notification may <https://developers.google.com/analytics/devguides/collection/analyticsjs/limits-quotas|hit this quota> if there are more than 20 users to notify:
> Each gtag.js and analytics.js tracker object starts with 20 hits that are replenished at a rate of 2 hits per second
I think for now we can just send one event to GA that records the number of notifications we sent this time (in the <https://developers.google.com/analytics/devguides/collection/analyticsjs/events|event value>, so that the numbers add up when querying across multiple days).

github 18:32:07

Comment on #207 Feature/165 cron notify

These comments will be sent as part of the GraphQL query string. I think we can remove them because the variable names already imply the same meaning.

github 18:32:07

Comment on #207 Feature/165 cron notify

Suggest adding `status:NORMAL` to exclude deleted replies

github 18:32:08

Comment on #207 Feature/165 cron notify

For processing the articles, I would suggest
1. Use pagination to get _all_ articles that match the filter, not just the first `N` articles.
2. Sequentially process each batch of `N` articles. Design APIs so that they can handle multiple articles at once.
```
// Async generator that gets a batch of articles with articleReply between `from` and `to`.
// The generator encapsulates complex pagination logic so that the function using it can focus on
// batch processing logic without worrying about pagination.
//
async function* getArticlesInBatch(from, to) {
  // Get pageInfo outside the loop since it's expensive for rumors-api
  const { data: { ListArticles: { pageInfo: { lastCursor } } } } = await gql`...`({ from, to });

  let after = undefined;
  while (lastCursor !== after) {
    // Actually loads `edges` and processes them.
    const { data: { ListArticles } } = await gql`...`({ from, to, after });
    yield ListArticles.edges.map(({ node }) => node);

    // The next gql call should go after the last cursor of this page
    after = ListArticles.edges[ListArticles.edges.length - 1].cursor;
  }
}

async function getNotificationList(lastScannedAt, nowWithOffset) {
  const result = {};

  // The for-await loop ensures that only one batch is processed at a time, so that we do not
  // make a bunch of queries to our MongoDB at once.
  //
  for await (const articles of getArticlesInBatch(lastScannedAt, nowWithOffset)) {
    // Process the batch.
    // Make an API that processes a batch of articleIds, instead of only one ID at a time.
    const userArticleLinks = await UserArticleLink.findByArticleIds(articles.map(({ id }) => id));
    userArticleLinks.forEach(/* Logic that populates result */);
  }

  return result;
}
```

github 18:39:32

Comment on #201 Feature/163 insert data

Should we test if `unfollow` event changes `allowNewReplyUpdate` to false?

github 19:05:06

Comment on #272 hotfix: add category bug

Still can't remove a category added by myself from the article detail page. The category first seems to be deleted, but re-appears after a refresh. I am seeing this warning in the console; not sure if it is relevant.
> Warning: fragment with name ArticleWithCategories already exists.
> graphql-tag enforces all fragment names across your application to be unique; read more about
> this in the docs: <http://dev.apollodata.com/core/fragments.html#unique-names|http://dev.apollodata.com/core/fragments.html#unique-names>
I found that
• there is no `status` field in the fetched `articleCategories` in the apollo cache.
• when querying an article's `articleCategories` we did not specify `(status: NORMAL)` in GraphQL.
I guess that's probably why all `articleCategories` are fetched and displayed. Since the display logic does not actually involve `status`, I suggest specifying `(status: NORMAL)` when querying `articleCategories` on the article detail page.
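A minimal sketch of what the adjusted query could look like (the field selection is illustrative, not the page's actual fragment):

```js
import gql from 'graphql-tag';

// Only NORMAL (non-deleted) article categories are fetched, so deleted ones never
// enter the Apollo cache and removals stay gone after a refetch.
const LOAD_ARTICLE_CATEGORIES = gql`
  query LoadArticleCategories($articleId: String!) {
    GetArticle(id: $articleId) {
      id
      articleCategories(status: NORMAL) {
        categoryId
        status
      }
    }
  }
`;

export default LOAD_ARTICLE_CATEGORIES;
```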

2020-07-05

github 02:34:11

Comment on #201 Feature/163 insert data

Hmm, it's because you said `using MongoDb commands` :joy::joy:. That was a bit confusing to me, but using the mongo CLI is also quick, so that's what I did haha

github 03:35:50

Comment on #201 Feature/163 insert data

How about we move `find()` into `src/database/models/base.js`? That way, all models can use `find()` just like they already have `findOneAndUpdate`. As a result, `findByUserId()` is for listing user-article links in `UserArticleLink`, and `find()` is for general use in all models.
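A minimal sketch of what that could look like (assuming the base class exposes the mongo collection through a `collection` accessor, which is an assumption about this repo):

```js
// src/database/models/base.js (sketch)
export default class Base {
  /**
   * Generic find shared by all models.
   * @param {object} query - mongo query object
   * @param {object} [options] - mongo find options (projection, sort, limit, ...)
   * @returns {Promise<object[]>}
   */
  static async find(query, options = {}) {
    const collection = await this.collection; // assumed accessor provided by each model
    return collection.find(query, options).toArray();
  }
}
```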

ggm 05:30:50
I later renamed `updateTimestamp` to `upsertByUserIdAndArticleId` and deleted `findOrInsertByUserIdAndArticleId`.
github 10:59:12

Comment on #201 Feature/163 insert data

But mongodb already has a method named `find()`. I would like to see `findXXX` methods directly invoking the mongodb client's methods. Adding one more layer with `find()` is a bit over-engineered to me.

github 11:10:22

Comment on #201 Feature/163 insert data

Agree with adding `find()` to `Base` so that the `Base` API is more consistent :woman-gesturing-ok:

ggm 12:53:18
By the way, I did something silly yesterday: I accidentally pushed after a `git rebase master`, but then I ran `git rebase dev` and pushed again. I've checked it over and it shouldn't be broken; if you see anything strange (?) it's probably my rebase not being done right.
github 17:05:59

Review on #272 hotfix: add category bug

It's working properly now. Thanks for the fix!

github 17:24:38

#179 ClaimReview search API

Discussion: <https://g0v.hackmd.io/@mrorz/cofacts-meeting-notes/%2F%40mrorz%2FByKAmKITU|https://g0v.hackmd.io/@mrorz/cofacts-meeting-notes/%2F%40mrorz%2FByKAmKITU>
> orz: Google has a dedicated search box for ClaimReview: <https://toolbox.google.com/factcheck/explorer|https://toolbox.google.com/factcheck/explorer> (by default it only shows fact-check results in the browser's language)
> It seems there is also a Search API: <https://toolbox.google.com/factcheck/apis|https://toolbox.google.com/factcheck/apis>
> Lucien: Should we build it into the API server? Then we could feed the search results directly into the DB, or convert them into a kind of reply.

github 17:35:00

#273 Revamp deleted reply list in article page

Spec: <https://g0v.hackmd.io/dYT7zCPGQiKsTne7-kkybw|https://g0v.hackmd.io/dYT7zCPGQiKsTne7-kkybw> Directions: only show deleted article-reply to its author

mrorz 17:39:17
Next, I will open issues to track the unfinished items from the previous phase of Cofacts Next.

Here I'd like to ask @stbb1025 to mark the enhancements mentioned in the slide deck below that are already done on production (with a green check mark or the like), so it's easier for me to sort out what hasn't been done yet >"< Thanks a lot!
https://docs.google.com/presentation/d/18VnEBMr9m-t81ppRwHcjbA1keltg-CIn7wUF9pu1oLo/edit

2020-07-06

mrorz 01:55:43
Today I spent some time wiring Google Data Studio (https://datastudio.google.com/u/0/reporting/18J8jZYumsoaCPBk9bdRd97GKvi_W5v-r/page/Cy2P) up with LINE's follower & demography APIs!

A Google Apps Script pulls the Cofacts chatbot's LINE OA data through the GraphQL endpoint exposed on the chatbot and writes it into a Google spreadsheet (https://docs.google.com/spreadsheets/d/1h9YAmi5i_dpYGYG5xUBBiW2Q_W0-ylJocb-MKY2nffc/edit). Google Data Studio (https://datastudio.google.com/u/0/reporting/18J8jZYumsoaCPBk9bdRd97GKvi_W5v-r/page/Cy2P) then uses that spreadsheet as its data source to draw the charts.

The follower count updates every day before dawn; demography updates every Wednesday before dawn.
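A minimal sketch of what such an Apps Script could look like (the endpoint URL, GraphQL query, and sheet name are placeholders, not the actual ones):

```js
// Google Apps Script, run on a daily time-driven trigger
function pullFollowerCount() {
  // Placeholder GraphQL endpoint exposed on the chatbot
  var res = UrlFetchApp.fetch('https://cofacts-chatbot.example.com/graphql', {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({
      query: '{ insights { followers targetedReaches blocks } }', // placeholder query
    }),
  });
  var insights = JSON.parse(res.getContentText()).data.insights;

  // Append one row per day into the spreadsheet that Data Studio reads from
  var sheet = SpreadsheetApp.openById('SPREADSHEET_ID').getSheetByName('followers');
  sheet.appendRow([new Date(), insights.followers, insights.targetedReaches, insights.blocks]);
}
```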
image.png
github 20:51:03

#180 Update README.md

add troubleshooting section for an error encountered during setup

github 20:54:43

#43 Update README.md

add configuration step

github 20:55:05

Comment on #180 Update README.md

<https://coveralls.io/builds/31882251|Coverage Status> Coverage remained the same at 86.826% when pulling *<https://github.com/cofacts/rumors-api/commit/b65fd038d3a7575e6835d297b21f357675c84f9f|b65fd03> on readme_grpc* into *<https://github.com/cofacts/rumors-api/commit/8ca66a996c9cb8e09da67c5d05e71d32e875ff77|8ca66a9> on master*.

github 21:26:52

Review on #43 Update README.md

Thanks for the info!

github 21:29:38

Review on #180 Update README.md

Thank you for adding troubleshooting info!

2020-07-07

ichieh 12:24:58
Hi everyone at Cofacts~ The July hackathon is finally back in person at Academia Sinica, so we need to find a host~ I'd like to ask whether Cofacts could help with this important task this time! :laughing::laughing:
ichieh 16:19:31
Just arrive at 9 o'clock that day~~~~
Here are notes from previous rounds of outsourced hosting for reference :laughing:
https://g0v.hackmd.io/@M3bjBcTXQ5i3BKGkYmRMiA/ryo6ijBGH

HackMD

HackMD - Collaborative Markdown Knowledge Base

## Hackathon hosting meetings - notes from projects that took over hosting ### Origin The Jothon team wants to become a more decentralized organization, so the "routine hosting" of each hackathon is outsourced to other groups. ### Routine hosting tasks 0. Pre-hackathon chat 1. Introduce newcomer…

github 21:37:40

#44 add schema for analytics

added schema for analytics per spec at <https://g0v.hackmd.io/0kjaVlFASSyddkltqGR6Nw|https://g0v.hackmd.io/0kjaVlFASSyddkltqGR6Nw>

stbb1025 22:20:57
I've added a gray divider line to the desktop top bar.
But I still feel it looks visually cleaner and more consistent without it QQ
Or is there a way to show the gray divider only when the bar scrolls down over other white sections? Would that be better?

@lucien @mrorz @yanglin5689446
截圖 2020-07-07 下午10.17.30.png
截圖 2020-07-07 下午10.18.07.png

2020-07-08

ba 09:37:03
@angel112811a has joined the channel
github 10:20:17

Review on #44 add schema for analytics

LGTM! Sorting the imports and exports looks so organized. I think we can also include the `docUserId` as discussed in one of the comments in <https://g0v.hackmd.io/0kjaVlFASSyddkltqGR6Nw|https://g0v.hackmd.io/0kjaVlFASSyddkltqGR6Nw>

github 15:55:31

#17 Move generated zip files to git-lfs instead

This should make pushing & cloning this repo easier <https://github.blog/2018-07-30-git-lfs-2-5-0-now-available/|https://github.blog/2018-07-30-git-lfs-2-5-0-now-available/>

github 17:34:07

Review on #44 add schema for analytics

LGTM! Let's merge it

github 21:35:09

Comment on #166 Implement Google analytics stats field in Article

*Server side todo list*
☑︎ update schema <https://github.com/cofacts/rumors-db/pull/44|cofacts/rumors-db#44>
☐ fetch data from google analytics and store in elasticsearch
 ☑︎ oAuth & google API
 ☐ insert/upsert data to elasticsearch & query to populate docUserId
 ☐ unit test
☐ data loader: (article ids) -> stats
 ☐ unit test
☐ new API endpoint to query for stats of a particular article
 ☐ unit test

2020-07-09

mrorz 02:26:30
Is the LINE Messaging API acting weird when connecting to ngrok right now @@
My ngrok HTTPS URL works when opened directly in a browser, but when LINE hits it, not even a request reaches ngrok...
ngrok-verify-webhook.gif
github 03:52:54

#208 Update translation to avoid interpolation error

Fix release blocker found in the 20200708 meeting <https://g0v.hackmd.io/ehzopA3fScSn9PjkBM79bA#%E2%9B%94%EF%B8%8F-Release-Blockers|https://g0v.hackmd.io/ehzopA3fScSn9PjkBM79bA#%E2%9B%94%EF%B8%8F-Release-Blockers>
*Typo fix*
<https://user-images.githubusercontent.com/108608/86963788-a2933400-c197-11ea-81a6-e2600db09577.png|image>
*Translation*
The original variable name collides with another occurrence. Fixed the translation by choosing a different variable name.
Note: there are actually some other places with variable name collisions, but ttag still works there. Not sure why it does not work this time.
<https://user-images.githubusercontent.com/108608/86963620-65c73d00-c197-11ea-909c-cfed763cf979.png|image>

github 03:56:01

Comment on #208 Update translation to avoid interpolation error

*Pull Request Test Coverage Report for <https://coveralls.io/builds/31948667|Build 1018>* • *1* of *1* *(100.0%)* changed or added relevant line in *1* file are covered. • No unchanged relevant lines lost coverage. • Overall coverage remained the same at *98.379%* :yellow_heart: - <https://coveralls.io|Coveralls>

mrorz 12:15:42
/github subscribe list features
github 12:15:42

Subscribed to the following accounts

<https://github.com/cofacts|cofacts>

mrorz 12:16:06
/github subscribe list features