diff --git a/.gitignore b/.gitignore
deleted file mode 100644
index 69225a3..0000000
--- a/.gitignore
+++ /dev/null
@@ -1,5 +0,0 @@
-model/__pycache__
-out
-website/
-docs-minimind-v/
-测试.py
\ No newline at end of file
diff --git a/.readthedocs.yaml b/.readthedocs.yaml
new file mode 100644
index 0000000..0223969
--- /dev/null
+++ b/.readthedocs.yaml
@@ -0,0 +1,19 @@
+# Read the Docs configuration file
+version: 2
+
+# Build configuration
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+
+# MkDocs configuration
+mkdocs:
+  configuration: mkdocs.yml
+  fail_on_warning: false
+
+# Python dependencies
+python:
+  install:
+    - requirements: requirements.txt
+
diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
deleted file mode 100644
index f95d690..0000000
--- a/CODE_OF_CONDUCT.md
+++ /dev/null
@@ -1,128 +0,0 @@
-# Contributor Covenant Code of Conduct
-
-## Our Pledge
-
-We as members, contributors, and leaders pledge to make participation in our
-community a harassment-free experience for everyone, regardless of age, body
-size, visible or invisible disability, ethnicity, sex characteristics, gender
-identity and expression, level of experience, education, socio-economic status,
-nationality, personal appearance, race, religion, or sexual identity
-and orientation.
-
-We pledge to act and interact in ways that contribute to an open, welcoming,
-diverse, inclusive, and healthy community.
-
-## Our Standards
-
-Examples of behavior that contributes to a positive environment for our
-community include:
-
-* Demonstrating empathy and kindness toward other people
-* Being respectful of differing opinions, viewpoints, and experiences
-* Giving and gracefully accepting constructive feedback
-* Accepting responsibility and apologizing to those affected by our mistakes,
- and learning from the experience
-* Focusing on what is best not just for us as individuals, but for the
- overall community
-
-Examples of unacceptable behavior include:
-
-* The use of sexualized language or imagery, and sexual attention or
- advances of any kind
-* Trolling, insulting or derogatory comments, and personal or political attacks
-* Public or private harassment
-* Publishing others' private information, such as a physical or email
- address, without their explicit permission
-* Other conduct which could reasonably be considered inappropriate in a
- professional setting
-
-## Enforcement Responsibilities
-
-Community leaders are responsible for clarifying and enforcing our standards of
-acceptable behavior and will take appropriate and fair corrective action in
-response to any behavior that they deem inappropriate, threatening, offensive,
-or harmful.
-
-Community leaders have the right and responsibility to remove, edit, or reject
-comments, commits, code, wiki edits, issues, and other contributions that are
-not aligned to this Code of Conduct, and will communicate reasons for moderation
-decisions when appropriate.
-
-## Scope
-
-This Code of Conduct applies within all community spaces, and also applies when
-an individual is officially representing the community in public spaces.
-Examples of representing our community include using an official e-mail address,
-posting via an official social media account, or acting as an appointed
-representative at an online or offline event.
-
-## Enforcement
-
-Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported to the community leaders responsible for enforcement at
-.
-All complaints will be reviewed and investigated promptly and fairly.
-
-All community leaders are obligated to respect the privacy and security of the
-reporter of any incident.
-
-## Enforcement Guidelines
-
-Community leaders will follow these Community Impact Guidelines in determining
-the consequences for any action they deem in violation of this Code of Conduct:
-
-### 1. Correction
-
-**Community Impact**: Use of inappropriate language or other behavior deemed
-unprofessional or unwelcome in the community.
-
-**Consequence**: A private, written warning from community leaders, providing
-clarity around the nature of the violation and an explanation of why the
-behavior was inappropriate. A public apology may be requested.
-
-### 2. Warning
-
-**Community Impact**: A violation through a single incident or series
-of actions.
-
-**Consequence**: A warning with consequences for continued behavior. No
-interaction with the people involved, including unsolicited interaction with
-those enforcing the Code of Conduct, for a specified period of time. This
-includes avoiding interactions in community spaces as well as external channels
-like social media. Violating these terms may lead to a temporary or
-permanent ban.
-
-### 3. Temporary Ban
-
-**Community Impact**: A serious violation of community standards, including
-sustained inappropriate behavior.
-
-**Consequence**: A temporary ban from any sort of interaction or public
-communication with the community for a specified period of time. No public or
-private interaction with the people involved, including unsolicited interaction
-with those enforcing the Code of Conduct, is allowed during this period.
-Violating these terms may lead to a permanent ban.
-
-### 4. Permanent Ban
-
-**Community Impact**: Demonstrating a pattern of violation of community
-standards, including sustained inappropriate behavior, harassment of an
-individual, or aggression toward or disparagement of classes of individuals.
-
-**Consequence**: A permanent ban from any sort of public interaction within
-the community.
-
-## Attribution
-
-This Code of Conduct is adapted from the [Contributor Covenant][homepage],
-version 2.0, available at
-https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
-
-Community Impact Guidelines were inspired by [Mozilla's code of conduct
-enforcement ladder](https://github.com/mozilla/diversity).
-
-[homepage]: https://www.contributor-covenant.org
-
-For answers to common questions about this code of conduct, see the FAQ at
-https://www.contributor-covenant.org/faq. Translations are available at
-https://www.contributor-covenant.org/translations.
\ No newline at end of file
diff --git a/LICENSE b/LICENSE
deleted file mode 100644
index f49a4e1..0000000
--- a/LICENSE
+++ /dev/null
@@ -1,201 +0,0 @@
- Apache License
- Version 2.0, January 2004
- http://www.apache.org/licenses/
-
- TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
- 1. Definitions.
-
- "License" shall mean the terms and conditions for use, reproduction,
- and distribution as defined by Sections 1 through 9 of this document.
-
- "Licensor" shall mean the copyright owner or entity authorized by
- the copyright owner that is granting the License.
-
- "Legal Entity" shall mean the union of the acting entity and all
- other entities that control, are controlled by, or are under common
- control with that entity. For the purposes of this definition,
- "control" means (i) the power, direct or indirect, to cause the
- direction or management of such entity, whether by contract or
- otherwise, or (ii) ownership of fifty percent (50%) or more of the
- outstanding shares, or (iii) beneficial ownership of such entity.
-
- "You" (or "Your") shall mean an individual or Legal Entity
- exercising permissions granted by this License.
-
- "Source" form shall mean the preferred form for making modifications,
- including but not limited to software source code, documentation
- source, and configuration files.
-
- "Object" form shall mean any form resulting from mechanical
- transformation or translation of a Source form, including but
- not limited to compiled object code, generated documentation,
- and conversions to other media types.
-
- "Work" shall mean the work of authorship, whether in Source or
- Object form, made available under the License, as indicated by a
- copyright notice that is included in or attached to the work
- (an example is provided in the Appendix below).
-
- "Derivative Works" shall mean any work, whether in Source or Object
- form, that is based on (or derived from) the Work and for which the
- editorial revisions, annotations, elaborations, or other modifications
- represent, as a whole, an original work of authorship. For the purposes
- of this License, Derivative Works shall not include works that remain
- separable from, or merely link (or bind by name) to the interfaces of,
- the Work and Derivative Works thereof.
-
- "Contribution" shall mean any work of authorship, including
- the original version of the Work and any modifications or additions
- to that Work or Derivative Works thereof, that is intentionally
- submitted to Licensor for inclusion in the Work by the copyright owner
- or by an individual or Legal Entity authorized to submit on behalf of
- the copyright owner. For the purposes of this definition, "submitted"
- means any form of electronic, verbal, or written communication sent
- to the Licensor or its representatives, including but not limited to
- communication on electronic mailing lists, source code control systems,
- and issue tracking systems that are managed by, or on behalf of, the
- Licensor for the purpose of discussing and improving the Work, but
- excluding communication that is conspicuously marked or otherwise
- designated in writing by the copyright owner as "Not a Contribution."
-
- "Contributor" shall mean Licensor and any individual or Legal Entity
- on behalf of whom a Contribution has been received by Licensor and
- subsequently incorporated within the Work.
-
- 2. Grant of Copyright License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- copyright license to reproduce, prepare Derivative Works of,
- publicly display, publicly perform, sublicense, and distribute the
- Work and such Derivative Works in Source or Object form.
-
- 3. Grant of Patent License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- (except as stated in this section) patent license to make, have made,
- use, offer to sell, sell, import, and otherwise transfer the Work,
- where such license applies only to those patent claims licensable
- by such Contributor that are necessarily infringed by their
- Contribution(s) alone or by combination of their Contribution(s)
- with the Work to which such Contribution(s) was submitted. If You
- institute patent litigation against any entity (including a
- cross-claim or counterclaim in a lawsuit) alleging that the Work
- or a Contribution incorporated within the Work constitutes direct
- or contributory patent infringement, then any patent licenses
- granted to You under this License for that Work shall terminate
- as of the date such litigation is filed.
-
- 4. Redistribution. You may reproduce and distribute copies of the
- Work or Derivative Works thereof in any medium, with or without
- modifications, and in Source or Object form, provided that You
- meet the following conditions:
-
- (a) You must give any other recipients of the Work or
- Derivative Works a copy of this License; and
-
- (b) You must cause any modified files to carry prominent notices
- stating that You changed the files; and
-
- (c) You must retain, in the Source form of any Derivative Works
- that You distribute, all copyright, patent, trademark, and
- attribution notices from the Source form of the Work,
- excluding those notices that do not pertain to any part of
- the Derivative Works; and
-
- (d) If the Work includes a "NOTICE" text file as part of its
- distribution, then any Derivative Works that You distribute must
- include a readable copy of the attribution notices contained
- within such NOTICE file, excluding those notices that do not
- pertain to any part of the Derivative Works, in at least one
- of the following places: within a NOTICE text file distributed
- as part of the Derivative Works; within the Source form or
- documentation, if provided along with the Derivative Works; or,
- within a display generated by the Derivative Works, if and
- wherever such third-party notices normally appear. The contents
- of the NOTICE file are for informational purposes only and
- do not modify the License. You may add Your own attribution
- notices within Derivative Works that You distribute, alongside
- or as an addendum to the NOTICE text from the Work, provided
- that such additional attribution notices cannot be construed
- as modifying the License.
-
- You may add Your own copyright statement to Your modifications and
- may provide additional or different license terms and conditions
- for use, reproduction, or distribution of Your modifications, or
- for any such Derivative Works as a whole, provided Your use,
- reproduction, and distribution of the Work otherwise complies with
- the conditions stated in this License.
-
- 5. Submission of Contributions. Unless You explicitly state otherwise,
- any Contribution intentionally submitted for inclusion in the Work
- by You to the Licensor shall be under the terms and conditions of
- this License, without any additional terms or conditions.
- Notwithstanding the above, nothing herein shall supersede or modify
- the terms of any separate license agreement you may have executed
- with Licensor regarding such Contributions.
-
- 6. Trademarks. This License does not grant permission to use the trade
- names, trademarks, service marks, or product names of the Licensor,
- except as required for reasonable and customary use in describing the
- origin of the Work and reproducing the content of the NOTICE file.
-
- 7. Disclaimer of Warranty. Unless required by applicable law or
- agreed to in writing, Licensor provides the Work (and each
- Contributor provides its Contributions) on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
- implied, including, without limitation, any warranties or conditions
- of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
- PARTICULAR PURPOSE. You are solely responsible for determining the
- appropriateness of using or redistributing the Work and assume any
- risks associated with Your exercise of permissions under this License.
-
- 8. Limitation of Liability. In no event and under no legal theory,
- whether in tort (including negligence), contract, or otherwise,
- unless required by applicable law (such as deliberate and grossly
- negligent acts) or agreed to in writing, shall any Contributor be
- liable to You for damages, including any direct, indirect, special,
- incidental, or consequential damages of any character arising as a
- result of this License or out of the use or inability to use the
- Work (including but not limited to damages for loss of goodwill,
- work stoppage, computer failure or malfunction, or any and all
- other commercial damages or losses), even if such Contributor
- has been advised of the possibility of such damages.
-
- 9. Accepting Warranty or Additional Liability. While redistributing
- the Work or Derivative Works thereof, You may choose to offer,
- and charge a fee for, acceptance of support, warranty, indemnity,
- or other liability obligations and/or rights consistent with this
- License. However, in accepting such obligations, You may act only
- on Your own behalf and on Your sole responsibility, not on behalf
- of any other Contributor, and only if You agree to indemnify,
- defend, and hold each Contributor harmless for any liability
- incurred by, or claims asserted against, such Contributor by reason
- of your accepting any such warranty or additional liability.
-
- END OF TERMS AND CONDITIONS
-
- APPENDIX: How to apply the Apache License to your work.
-
- To apply the Apache License to your work, attach the following
- boilerplate notice, with the fields enclosed by brackets "[]"
- replaced with your own identifying information. (Don't include
- the brackets!) The text should be enclosed in the appropriate
- comment syntax for the file format. We also recommend that a
- file or class name and description of purpose be included on the
- same "printed page" as the copyright notice for easier
- identification within third-party archives.
-
- Copyright [yyyy] [name of copyright owner]
-
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
\ No newline at end of file
diff --git a/README.md b/README.md
index 92f3b9d..f5274b5 100644
--- a/README.md
+++ b/README.md
@@ -1,801 +1 @@
-
-
-
-
-
-
-
-
-
-
-[](https://github.com/jingyaogong/minimind-v/stargazers)
-[](LICENSE)
-[](https://github.com/jingyaogong/minimind-v/commits/master)
-[](https://github.com/jingyaogong/minimind-v/pulls)
-[](https://huggingface.co/collections/jingyaogong/minimind-v-67000833fb60b3a2e1f3597d)
-
-
-
-
-
-
-
-
-
-
-
-
"大道至简"
-
-
-
-
-中文 | [English](./README_en.md)
-
-
-
-* 此项目旨在从0开始,仅用1.3块钱成本 + 1小时!即可训练出26M参数的超小多模态视觉语言模型**MiniMind-V**。
-* **MiniMind-V**最小版本体积仅为 GPT3 的约 $\frac{1}{7000}$,力求做到个人GPU也可快速推理甚至训练。
-* **MiniMind-V**是[MiniMind](https://github.com/jingyaogong/minimind)纯语言模型的视觉能力额外拓展。
-* 项目同时包含了VLM大模型的极简结构、数据集清洗、预训练(Pretrain)、监督微调(SFT)等全过程代码。
-* 这不仅是一个开源VLM模型的最小实现,也是入门视觉语言模型的简明教程。
-* 希望此项目能为所有人提供一个抛砖引玉的示例,一起感受创造的乐趣!推动更广泛AI社区的进步!
-
-> 为防止误解,“1小时” 基于NVIDIA 3090硬件设备(单卡)测试`1 epoch`,“1.3块钱” 指GPU服务器租用成本。
-
-
-
-
-
-
-
-[🔗🤖在线体验](https://www.modelscope.cn/studios/gongjy/MiniMind-V) | [🔗🎞️视频介绍](https://www.bilibili.com/video/BV1Sh1vYBEzY)
-
-
-
-# 📌 Introduction
-
-“用乐高拼出一架飞机,远比坐在头等舱里飞行更让人兴奋!”
-构建VLM范式的多模态大模型是否真的如想象中那样复杂?它的代码实现到底如何?
-训练过程究竟难不难?那么现在,探索它们的答案,一起感受创造的乐趣吧!
-
-> [!TIP]
-> (截至2025-02-20)MiniMind-V 系列已完成了以下型号模型训练,最小仅需26M (0.026B),即可具备识图和对话的能力!
-
-| 模型 (大小) | 推理占用 | release |
-|---------------------------|--------|------------|
-| MiniMind2-V (104M)        | 1.1 GB | 2025.02.20 |
-| MiniMind2-Small-V (26M)   | 0.6 GB | 2025.02.20 |
-| minimind-v-v1-small (27M) | 0.6 GB | 2024.10.04 |
-| minimind-v-v1 (109M) | 1.1 GB | 2024.10.04 |
-
-### 👉**最近更新**
-
-
- 2025-10-24
-
-- bug修复:模型权重不对应
-- 适配[「minimind-1024更新」](https://github.com/jingyaogong/minimind)
-- 代码重构:训练和评估脚本规范化
-- 新增完整的断点续训支持
-
-
-
-
- 2025-04-27
-
-- 兼容性更新
-- 适配[「minimind仓库新特性」](https://github.com/jingyaogong/minimind/issues/370)
-- 规范化部分代码
-
-
-
-
- 2025-02-20
-
-- MiniMind2-V伴随MiniMind2同步更新
-- 大幅减少所有冗余代码,规范代码格式
-- 大幅精简模型冗余结构
-- 更新数据集格式,拓展新的SFT数据集
-- 比前代VLM更优秀的效果!
-
-
-
-
-
- More...
-
-**2024-10-05**
-
-- MiniMind-V如期而至,首次开源
-
-
-
-# 📌 快速开始
-
-
-分享本人的软硬件配置(仅供参考)
-
-* CPU: Intel(R) Core(TM) i9-10980XE CPU @ 3.00GHz
-* RAM: 128 GB
-* GPU: NVIDIA GeForce RTX 3090(24GB) * 8
-* Ubuntu==20.04
-* CUDA==12.2
-* Python==3.10.16
-* [requirements.txt](./requirements.txt)
-
-
-
-### 第0步
-
-```bash
-# 克隆代码仓库
-git clone https://github.com/jingyaogong/minimind-v
-```
-
-```bash
-# 下载clip模型到 ./model/vision_model 目录下
-git clone https://huggingface.co/openai/clip-vit-base-patch16
-# or
-git clone https://www.modelscope.cn/models/openai-mirror/clip-vit-base-patch16
-```
-
-```bash
-# 下载minimind语言模型权重到 ./out 目录下(作为训练VLM的基座语言模型)
-# HuggingFace
-https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch/blob/main/llm_512.pth # or llm_768.pth
-# 国内源
-https://modelscope.cn/models/gongjy/MiniMind2-V-PyTorch/resolve/master/llm_512.pth # or llm_768.pth
-```
-
-## Ⅰ 测试已有模型效果
-
-### 1.环境准备
-
-```bash
-pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
-```
-
-### 2.下载模型
-
-```bash
-git clone https://huggingface.co/jingyaogong/MiniMind2-V
-```
-
-### 3.命令行问答
-
-```bash
-# load_from='model': 加载原生PyTorch权重, load_from='其他路径': 加载transformers格式
-python eval_vlm.py --load_from model --weight sft_vlm
-
-# 或使用transformers格式模型
-python eval_vlm.py --load_from MiniMind2-V
-```
-
-### 4.或启动WebUI
-
-```bash
-python web_demo_vlm.py
-```
-
-## Ⅱ 从0开始自己训练
-
-### 1.环境准备
-
-```bash
-pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
-```
-
-
-注:提前测试Torch是否可用cuda
-
-```python
-import torch
-print(torch.cuda.is_available())
-```
-
-如果不可用,请自行去[torch_stable](https://download.pytorch.org/whl/torch_stable.html)
-下载whl文件安装。参考[链接](https://blog.csdn.net/weixin_45456738/article/details/141029610?ops_request_misc=&request_id=&biz_id=102&utm_term=%E5%AE%89%E8%A3%85torch&utm_medium=distribute.pc_search_result.none-task-blog-2~all~sobaiduweb~default-2-141029610.nonecase&spm=1018.2226.3001.4187)
-
-
-
-### 2.数据下载
-
-从下文提供的[数据集链接](https://huggingface.co/datasets/jingyaogong/minimind-v_dataset)
-下载所需内容并放到`./dataset`下。
-
-
-注:数据集须知
-
-Pretrain数据:
-```bash
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/pretrain_data.jsonl
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/pretrain_images.zip
-unzip pretrain_images.zip && rm pretrain_images.zip
-```
-
-SFT数据:
-```bash
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/sft_data.jsonl
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/sft_images.zip
-unzip sft_images.zip && rm sft_images.zip
-```
-
-`*.jsonl`为问答文本,`*images`为配套的图片数据,下载完成后需要解压图像数据。
-
-请预留~5GB空间存放数据集,若无多余空间存放pretrain数据,可尝试跳过pretrain训练步骤直接进行sft训练。
-
-
-
-### 3.开始训练
-
-**3.1 预训练(学图像描述)**
-
-```bash
-# 基础训练命令(从LLM权重开始,仅训练vision_proj)
-python train_pretrain_vlm.py --epochs 4 --from_weight llm
-```
-
-> 执行预训练,得到 `pretrain_vlm_*.pth` 作为预训练的输出权重(其中*为模型的dimension,默认为512)
-
-
-**3.2 监督微调(学看图对话方式)**
-
-```bash
-# 基础训练命令(从预训练权重开始,全参数微调)
-python train_sft_vlm.py --epochs 2 --from_weight pretrain_vlm
-```
-
-> 执行监督微调,得到 `sft_vlm_*.pth` 作为指令微调的输出权重
-
-
-注:训练须知
-
-**训练特性:**
-- 支持断点续训:添加`--from_resume 1`参数可从上次中断处继续训练
-- 支持GPU数量变化:续训时GPU数量改变会自动转换step
-- 原子性保存:使用临时文件+替换机制,防止保存过程中断导致权重损坏(思路见下方示意代码)
-- 每次保存同时生成`out/**.pth`(模型权重)和`checkpoints/**_resume.pth`(训练状态)文件
-
-```bash
-# 训练中断后,使用相同命令并添加 --from_resume 1
-python train_sft_vlm.py --epochs 4 --from_resume 1
-```
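-
-上文"原子性保存"的思路可以用下面的示意代码理解(并非本仓库源码,文件命名仅为假设):
-
-```python
-import os
-import torch
-
-def atomic_save(state_dict, path):
-    tmp_path = path + ".tmp"
-    torch.save(state_dict, tmp_path)   # 先完整写入临时文件
-    os.replace(tmp_path, path)         # 同一文件系统内替换是原子操作,中断也不会损坏已有权重
-```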
-
-**参数说明:**
-- `--from_weight`: 基础权重名称(llm, pretrain_vlm, none等)
-- `--save_weight`: 保存权重的前缀名
-- `--from_resume`: 是否续训(0=从头开始,1=从检查点继续)
-- `--freeze_llm`: 是否冻结LLM参数(仅pretrain使用)
-- 更多可直接参考代码
-
-
-
-
----
-
-### 4.测试模型效果
-
-确保需要测试的模型`*.pth`文件位于`./out/`目录下。
-也可以直接去[此处](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch)下载使用我训练的`*.pth`文件。
-
-```bash
-# 测试SFT模型(默认)
-python eval_vlm.py --weight sft_vlm
-
-# 测试Pretrain模型
-python eval_vlm.py --weight pretrain_vlm
-```
-
----
-
-> [!TIP]
-> 训练脚本均为Pytorch原生框架,均支持多卡加速,假设你的设备有N (N>1) 张显卡:
-
-单机N卡启动训练方式 (DDP, 支持多机多卡集群)
-
-```bash
-torchrun --nproc_per_node N train_xxx.py
-```
-
-
-注:其它须知
-
-
-单机N卡启动训练 (DeepSpeed)
-
-```bash
-deepspeed --master_port 29500 --num_gpus=N train_xxx.py
-```
-
-
-可根据需要开启wandb记录训练过程
-
-```bash
-# 需要登录: wandb login
-torchrun --nproc_per_node N train_xxx.py --use_wandb
-# and
-python train_xxx.py --use_wandb
-```
-
-通过添加`--use_wandb`参数,可以记录训练过程,训练完成后,可以在wandb网站上查看训练过程。通过修改`wandb_project`
-和`wandb_run_name`参数,可以指定项目名称和运行名称。
-
-【注】:25年6月后,国内网络环境无法直连WandB,MiniMind项目默认转为使用[SwanLab](https://swanlab.cn/)作为训练可视化工具(完全兼容WandB API),即`import wandb`改为`import swanlab as wandb`即可,其他均无需改动。
-
-
-
-# 📌 VLM Detail
-
-MiniMind-V (VLM)的基座语言模型MiniMind (LLM)来自孪生项目[minimind](https://github.com/jingyaogong/minimind),
-具体的模型结构、训练细节、原理、测试效果等均可移步[minimind](https://github.com/jingyaogong/minimind)项目查阅。
-此处为减少冗余,省略讨论LLM的相关部分,默认您已对MiniMind (LLM)的细节有基本的了解。
-
-> 即使您不太了解LLM的细节,也可参考“快速开始”流程训练一个MiniMind-V,
-> 这并不受到影响,仓库致力于最低成本的开箱即用!
-
-MiniMind-V的结构仅增加Visual Encoder和特征投影两个子模块,增加模态混合分支,以支持多种模态信息的输入:
-
-
-
-
-
- 【重要】一些有趣的思考
-
-此处不妨展开想一想两个问题:
-
-* 什么叫做**L**arge **L**anguage **M**odel (LLM)?
-* 什么叫做多模态模型?
-
-[这篇文章](https://www.jiqizhixin.com/articles/2024-09-15-3)完美吻合本人的想法:
-大语言模型(LLM)名字虽然带有语言二字,但它们其实与语言关系不大,这只是历史问题,更确切的名字应该是自回归 Transformer
-或者其他。LLM 更多是一种统计建模的通用技术,它们主要通过自回归 Transformer 来模拟 token 流,而这些 token
-可以代表文本、图片、音频、动作选择、甚至是分子等任何东西。
-因此,只要能将问题转化为模拟一系列离散 token 的流程,理论上都可以应用 LLM 来解决。
-实际上,随着大型语言模型技术栈的日益成熟,我们可能会看到越来越多的问题被纳入这种建模范式。也就是说,问题固定在使用 LLM
-进行『下一个 token 的预测』,只是每个领域中 token 的用途和含义有所不同。
-
-[ZJU-LiXi老师](https://person.zju.edu.cn/xilics#694283)同样谈及过类似观点(原话大意如下):
-文本、视频、语音、动作等在人类看来属于「多模态」信号,但所谓的「模态」其实只是人类在信息存储方式上的一种分类概念。
-就像`.txt`和`.png`文件,虽然在视觉呈现和高级表现形式上有所不同,但它们本质上并没有根本区别。
-之所以出现「多模态」这个概念,仅仅是因为人类在不同的感知层面上对这些信号的分类需求。
-然而,对于机器来说,无论信号来自何种「模态」,最终它们都只是以一串二进制的「单模态」数字序列来呈现。
-机器并不会区分这些信号的模态来源,而只是处理和分析这些序列背后所承载的信息内容。
-
-个人认为**G**enerative **P**retrained **T**ransformer (GPT) 比 **L**arge **L**anguage **M**odel (LLM)更为贴切,
-因此本人表达上更习惯用"GPT"去代表LLM/VLM/类GPT架构的系列模型,而非为了蹭OpenAI的热度。
-
-至此,我们可以用一句话总结GPT的所作所为:
-
-GPT模型根据现有token预测输出下一个下下一个下下下一个token ...,直到模型输出结束符;此处的"token"其实并不需要一定是文本!
-
-```text
-> 对于LLM模型,如果需要理解"图片",我们只要把"图片"作为对一种特殊的从来没见过的"外国语言",通过"外语词典"翻译后即可作为特殊的语言输入LLM
-> 对于LLM模型,如果需要理解"音频",我们只要把"音频"作为对一种特殊的从来没见过的"外国语言",通过"外语词典"翻译后即可作为特殊的语言输入LLM
-> ...
-```
-
-**为了得到MiniMind-V,我们只需要完成这2件事即可:**
-
-1. 借助擅长翻译图片的 **"外语词典"** ,把图片从 **"外国语言"** 翻译为模型便于理解的 **"LLM语言"**
-2. 训练微调LLM,使其和 **"外语词典"** 度过磨合期,从而更好的理解图片
-
-"外语词典" 称之为Visual Encoder模型。
-和LlaVA、Qwen-VL等视觉语言模型类似,MiniMind-V同样选用开源Clip系列模型作为Visual Encoder。
-具体使用[clip-vit-base-patch16](https://huggingface.co/openai/clip-vit-base-patch16),
-一种基于 ViT-B/16 架构的经典Visual Encoder用于描述图像文本信息。
-输入的图像尺寸为224×224,划分的Patch为16×16,因此会产生14×14=196个token作为encoder编码层的输入,
-最终产生1×768维的嵌入向量,用于与配对文本计算对比损失。
-我们并不需要这个最终的嵌入表示,只取encoder层的输出,也就是ViT核心主干的输出特征即可,
-它是一组196×768大小的特征,我们把它作为196个visual token输入MiniMind-V。
-在获取图像encoder特征后,与LLM的结合一方面需要把768维的visual token对齐到LLM的文本token维度,
-另一方面要将图像特征映射到与文本embedding相同的语义空间,即文本token和原生的视觉token需要磨合,不能直接一视同仁,
-这一步可以称之为跨模态的特征对齐。
-[LlaVA-1](https://arxiv.org/pdf/2304.08485)使用简单的无偏线性变换完成了这一操作,效果很不错,MiniMind-V同样如此。
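-
-下面用一段最小示意代码(并非本仓库源码;LLM维度取512、图片路径`demo.jpg`均为演示假设)展示"取ViT主干的196×768特征 + 无偏线性投影"这一思路:
-
-```python
-import torch
-from torch import nn
-from PIL import Image
-from transformers import CLIPVisionModel, CLIPImageProcessor
-
-# 只加载CLIP ViT-B/16的视觉编码器(无需文本塔)
-vision = CLIPVisionModel.from_pretrained("openai/clip-vit-base-patch16").eval()
-processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-base-patch16")
-
-image = Image.open("demo.jpg")                      # 任意图片,processor会缩放到224×224
-pixels = processor(images=image, return_tensors="pt").pixel_values
-
-with torch.no_grad():
-    hidden = vision(pixels).last_hidden_state       # [1, 197, 768],含1个[CLS] token
-patch_tokens = hidden[:, 1:, :]                     # 去掉[CLS],得到196个visual token
-
-vision_proj = nn.Linear(768, 512, bias=False)       # 无偏线性变换,512为假设的LLM维度
-visual_embeds = vision_proj(patch_tokens)           # [1, 196, 512],用于替换文本中的图像占位符embedding
-```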
-
-
-
-至此,MiniMind-V的内部结构变化已经呈现完毕。
-
-
-
-
----
-
-下面,我们简单讨论MiniMind-V的外部输入输出的变化。
-
-VLM的输入依然是一段文本,其中包含特殊的``占位符。
-在计算文本嵌入后,可以将图像编码器生成的向量投影到该占位符对应的嵌入部分,替换掉原先的占位符embedding。
-例如:
-
-```text
-\n这个图像中有什么内容?
-```
-
-在`minimind-v`中,使用196个字符组成的 `@@@...@@@`
-占位符代替图像,之所以是196个字符,前面有所提及:
-任何图像都会被CLIP模型编码为196×768维的token,
-因此`minimind-v`的prompt为:
-
-```text
-@@@......@@@\n这个图片描述的是什么内容?
-```
-
-计算完embedding和projection,并对图像部分token替换后整个计算过程到输出则和LLM部分没有任何区别。
-
-
-
-一次性多图的实现方法就是通过注入多个``图像占位符进行实现,不需要修改任何框架。
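-
-占位符替换可以用下面的最小示意代码理解(并非本仓库源码,函数名与变量名均为演示假设);多图场景同样适用,只需保证图片顺序与占位符出现顺序一致:
-
-```python
-import torch
-
-def replace_image_embeddings(text_embeds, visual_embeds, placeholder_mask):
-    """
-    text_embeds:      [seq_len, dim]  含k*196个占位符token的文本embedding
-    visual_embeds:    [k, 196, dim]   投影后的视觉token,每张图196个
-    placeholder_mask: [seq_len]       bool,占位符所在位置为True
-    """
-    out = text_embeds.clone()
-    # 布尔索引保持序列顺序,因此按出现顺序依次填入各张图片的196个visual token
-    out[placeholder_mask] = visual_embeds.reshape(-1, visual_embeds.shape[-1])
-    return out
-```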
-
-
- 视频理解的拓展思路
-
-write by [@xinyanghuang7](https://github.com/xinyanghuang7)
-
-对于多模态大模型的视频理解能力,一个可行的思路是参考现有MiniCPM-V 2.6 进行视频理解的Python示例。
-主要思想是通过提取视频关键帧,而后进行多图推理。
-因此,如果希望在MiniMind-V中添加视频理解能力,可以在现有多图训练的基础上,参考此python脚本中对于关键帧的提取方法,而后加大训练文件中支持图片的数量。
-所支持的MAX_NUM_FRAMES越多,所消耗的显存越大。
-
-```python
-import torch
-from PIL import Image
-from transformers import AutoModel, AutoTokenizer
-from decord import VideoReader, cpu # pip install decord
-
-model = AutoModel.from_pretrained('openbmb/MiniCPM-V-2_6', trust_remote_code=True,
- attn_implementation='sdpa',
- torch_dtype=torch.bfloat16) # sdpa or flash_attention_2, no eager
-model = model.eval().cuda()
-tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-V-2_6', trust_remote_code=True)
-
-MAX_NUM_FRAMES = 64 # if cuda OOM set a smaller number
-
-
-def encode_video(video_path):
- def uniform_sample(l, n):
- gap = len(l) / n
- idxs = [int(i * gap + gap / 2) for i in range(n)]
- return [l[i] for i in idxs]
-
- vr = VideoReader(video_path, ctx=cpu(0))
- sample_fps = round(vr.get_avg_fps() / 1) # FPS
- frame_idx = [i for i in range(0, len(vr), sample_fps)]
- if len(frame_idx) > MAX_NUM_FRAMES:
- frame_idx = uniform_sample(frame_idx, MAX_NUM_FRAMES)
- frames = vr.get_batch(frame_idx).asnumpy()
- frames = [Image.fromarray(v.astype('uint8')) for v in frames]
- print('num frames:', len(frames))
- return frames
-
-
-video_path = "video_test.mp4"
-frames = encode_video(video_path)
-question = "Describe the video"
-msgs = [
- {'role': 'user', 'content': frames + [question]},
-]
-
-# Set decode params for video
-params = {}
-params["use_image_id"] = False
-params["max_slice_nums"] = 2 # 如果cuda OOM且视频分辨率大于448*448可设为1
-
-answer = model.chat(
- image=None,
- msgs=msgs,
- tokenizer=tokenizer,
- **params
-)
-print(answer)
-```
-
-
-
-至此,`MiniMind-V`的所有细节已经呈现完毕。
-`MiniMind-V`的模型子类完全继承自`MiniMind`,
-仅基于后者做**最小**变更而产生,
-其核心算法改动`< 50行`,迁移难度极低。
-因此可能和`LLaVA`等模型在细节上存在区别,但思路完全统一。
-
-# 📌 Experiment
-
-## Ⅰ 数据集
-
-来源:[Chinese-LLaVA-Vision](https://huggingface.co/datasets/LinkSoul/Chinese-LLaVA-Vision-Instructions)
-包含约57万张预训练图像,来自CC-3M和COCO 2014;
-[llava-en-zh-300k](https://huggingface.co/datasets/BUAADreamer/llava-en-zh-300k)
-包含300k条指令微调数据和15万张图像。
-问答内容经过翻译,
-对中文支持更友好,进一步经过整理并`resize`。
-
-(pretrain_vlm_data.jsonl) 预训练数据集格式:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "提供给定图像的简要描述。\n"
- },
- {
- "role": "assistant",
- "content": "橄榄油是自由使用的健康成分。"
- }
- ],
- "image": "GCC_train_002582585.jpg"
-}
-```
-
-(sft_vlm_data.jsonl) 单图指令微调数据集格式:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "闹钟的位置对睡眠质量有什么影响?"
- },
- {
- "role": "assistant",
- "content": "把数字闹钟放在床头柜..."
- }
- ],
- "image": "train-00000-of-00001_image_0_0.jpg"
-}
-```
-
-(sft_vlm_data_multi.jsonl) 多图指令微调数据集格式:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "context: Source Image: Target Image: Instruction: What is the correct image edit instruction that can transfrom the source image to target image?"
- },
- {
- "role": "assistant",
- "content": "take the people out of the back in the photo. Remove the two people behind the woman in the white dress and the man in the blue suit. remove people behind the couple in the centre"
- }
- ],
- "image": "0.jpg, 1.jpg"
-}
-```
-
-
- 数据说明
-
-* 多图数据集规模相对较小且为英文对话,数据集仅包含两图对比的场景,因此微调效果有限,这里只提供一种参考思路。
-
-
-* `jsonl`均为文本指令,`images.zip`均为配套的图像数据(下载后需要解压)
-
-
-
-数据集下载地址:([ModelScope](https://www.modelscope.cn/datasets/gongjy/minimind-v_dataset) | [HuggingFace](https://huggingface.co/datasets/jingyaogong/minimind-v_dataset))
-
-## Ⅱ 训练
-
-> train_pretrain_vlm
-
-预训练从595K条数据集中学习图片的通用知识,比如鹿是鹿,狗是狗。
-
-> train_sft_vlm
-
-指令微调从300K条真实对话数据集中学习对图片提问的真实问答格式,更符合与人类的交流习惯。
-
-> train_sft_vlm
-
-多图微调提供一个demo:鸟类对比数据集,包含13.6k条真实问答。
-
-训练时均冻结visual encoder也就是clip模型梯度,
-只训练Projection和LLM两部分。
-预训练中,只设置Projection和LLM的最后一层参数可学习。
-指令微调中,设置Projection和LLM的全部参数可学习。
-
-> 训练时间和Loss走势(仅供参考)
-
-Pretrain [512+8] & [768+16]
-
-
-SFT [512+8] & [768+16]
-
-
-## Ⅲ 模型权重
-
-(原生PyTorch`*.pth`权重文件) 下载地址:
-([ModelScope](https://www.modelscope.cn/models/gongjy/MiniMind2-V-PyTorch) | [HuggingFace](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch))
-
-(`Transformers`格式模型)
-下载地址:
-([ModelScope](https://www.modelscope.cn/profile/gongjy) | [HuggingFace](https://huggingface.co/collections/jingyaogong/minimind-v-67000833fb60b3a2e1f3597d))
-
-> 注:Transformers版本均为单图指令微调后的`MiniMind-V`模型
-
-# 📌 Test
-
-### 效果测试
-
-#### 单图对话
-
-
-
-
- 图片
- MiniMind2-V
- MiniMind2-V-Small
-
-
-
-
-
-
-
-
- 图中是一个繁忙的城市街道,一条长长的街道两旁都是高楼大厦。这条街上挤满了汽车、卡车和公共汽车,还有许多其他车辆在路上行驶。在街道上,可以看到许多汽车,有的在高速行驶,而其他的则停在街道一侧。此外还有一辆公交车也停在街道的右侧。街道上可以看到交通灯,表明这是一个繁忙的城市环境。
- 图中是一个繁忙的城市景象,有几辆汽车和一辆卡车行驶在城市街道上。可以看到许多交通信号灯,其中一些位于街道左侧,另一些则在右侧。可以看到有几个人在街上行走,其中一些人站得离街道更近一些,而另一些则距离较远。还有一个停车标志位于画面的左侧,暗示着城市环境。可以看到街道上有两辆汽车,一辆在右边,另一辆在左边,还有一辆在左边。这幅图像捕捉到了都市环境中典型的一天。
-
-
-
-
-
-
- 图片显示了一个宇航员的宇航员身穿宇航服,坐在一架大型航天飞机上。他们似乎正在进行一次宇航员登机或下机的旅程。在宇航员的身后,有一个火箭发射架,可能是用来支撑宇航员在旅程中的任务。此外,还有一架飞机停在机库附近,进一步表明这是一次航空展。在飞机的周围,还有一些人,但他们看起来离飞机很近。可以看到一个人站在飞机附近,可能正在观察或等待航天飞机准备起飞。
- 场景中,一名士兵戴着头盔站在一架大型飞机上。这架飞机似乎是一架军用军用飞机,似乎正准备登上一架飞机。另一个人则站在前面,可能正在观察飞行过程。在飞机周围,有几个人,其中一些站在左侧,另一些则站在右侧。他们似乎正在观看飞行员的表现。此外,还有一辆卡车停在靠近左侧的位置,可能是为了更具体地观察飞行过程。
-
-
-
-
-
-
- 图片中,一个女人坐在沙滩上,手里拿着一只白色的狗。她看起来像是个女人,坐在沙地上,看着她。一只狗也坐在她旁边,看起来很放松和舒适。海滩上散布着其他沙滩游客,有些人坐着,而另一些人则坐在更远的地方。背景中可以看到一艘船,这表明这是一个受欢迎的海滩旅游目的地。
- 两个人坐在海滩上,一边懒洋洋地躺在沙滩上,另一边则坐着。他们似乎正在享受海边时光。海滩上有几把椅子,其中一把靠近沙滩的左侧,另一把在中间。此外,还有一只狗躺在沙地上,为这个场景增添了一种放松的气氛。
-
-
-
-
-
-
- 照片捕捉到一幅美丽如画的大自然场景,背景是高山峦崖。在水边,一座巨大的喷泉横跨着水面,吸引着许多游客。水面上有几个人,他们或站或坐在喷泉周围,或站或坐。有些人可以看到他们在水中行走,而其他人则站在水边。总体而言,这幅画描绘的是一个美丽而宁静的环境,在那里人们可以欣赏到如画般的美景。
- 在一个美丽的蓝色天空下,一座巨大而巨大的白色瀑布上方悬挂着一只巨大的湿流水。这只瀑布位于一座山上,为整个场景增添了一种迷人而又宁静的气氛。在这幅图像的背景中,可以看到几艘船,其中一些靠近水边,其他的则离得较远。这些船只似乎正在为风景或户外活动做准备。
-
-
-
-
-
-
- 图中,一个男人坐在公园的长椅上,旁边是一把绿色椅子。他身边有一本打开的书,上面写着"读书"一句话,暗示他可能正在阅读。公园里有一张长椅和一张公园长椅,为周围的环境增添了几分生气。在公园的周围,有几辆汽车和一辆卡车,表明这是一个公共区域。此外,还可以看到一个人站在公园的不同位置上,可能是等着上路或过马路。
- 一个穿着短裤的老人坐在公园长椅上,周围是树木。他似乎正在读一本书,可能是在读书。背景中有一座长凳,为这个场景提供了充足的座位。在背景中,可以看到一把椅子和一张餐桌,这说明这个场景可能是在一个户外座位区,那里有椅子供人们坐下来放松。
-
-
-
-
-
-
- 图中,一只白色的棕熊坐在草地上,旁边是一只长着棕色斑点的大熊。这只熊看起来很害羞或顽皮,因为它正躺在草地上休息,看上去很放松。
- 在这幅图像中,一只棕色的熊正在草地上漫步。这只熊被放置在草地上,占据了画面的大部分空间。它似乎正在自然环境中行走,可能是在草地上。在背景中,有几棵树,为画面增添了自然元素。一只鸟在场景的中间附近飞翔,为画面增添了生气勃勃的气氛。
-
-
-
-
-
-
- 图片展示了一个漂亮的花瓶,里面插满了五颜六色的鲜花和花束。这些花束散落在整个花瓶中,给人一种赏心悦目的感觉。花瓶里插着五颜六色鲜花,创造出一种令人赏心悦目的景象。这些鲜花被摆放在一张桌子上,很可能是为了展示它们的美丽而摆放的。
- 场景中,一辆绿色和紫色相间的自行车停在一栋建筑旁边,它被放置在一棵大树旁。这辆自行车被摆放在附近,为这个场景增添了几分色彩。除了自行车外,还有一些其他的自行车,包括两个位于前景中的一个和位于背景中靠近中心位置的另一个。自行车的存在表明它可能是停在那里的。
-
-
-
-
-
-
- 图片中的女人穿着一件白色连衣裙,脚上还系着一条黑色网球裙。她正在表演一个网球比赛,很可能是在比赛中。在背景中可以看到几把椅子,可能是为了观众或其他观众的座位安排而摆放的。此外,还有一个长凳放在场景左侧,为人们提供了一个休息的地方。
- 一名身穿白色衣服的女子站在舞台上,手里拿着一只手拿着白色飞盘。她似乎正在参加一个舞台舞会或比赛。场景中还有其他几个人,其中一个站在舞台左侧,另一个站在右侧,第三个人则站在场地右侧。舞台上有几个观众,有的站着,有的坐着,还有一些站着。这看起来像是一场欢乐的节日或活动。
-
-
-
-
-#### 多图对话(效果十分有限)
-
-
-
-
- 图片1
- 图片2
- 512_sft_multi
- 768_sft_multi
-
-
-
-
-
-
- 这幅图像显示了一种鸟簸戮的场景:一个女人站在红绿相间的红绿相间的紫色鸟簸戴在她身上。女人站在红色的鸟簸戴在她身上,而她的翻领上的那只红鸟则站在她身后。
- 这两只鸟在同一片树林中飞翔,有的位于画面中心,而另一些则较小,形成了鲜明对比。这种鸟类的出现突出了其飞行能力和适应性,因为它们能够在树林中快速迅速移动。此外,两只鸟的位置不同,一个在图像的左边,另一个在右边,这表明它们在同一片树林中移动得很近。这种鸟类的自然行为也有助于区分这两种鸟类物种。
-
-
-
-
-### 效果小结:
-
-视觉信号对于LLM而言可视作一种特殊的外语,
-因此“学习外语”的能力高低,
-很大程度上取决于LLM的能力。
-LLM性能越强,对应的VLM必然越强,此时效果增益会很明显。
-
-#### 未来值得改进的方面:
-
-```text
-> Projection这种较简单的跨模态特征对齐方式,相较于Cross-Attention可能处于劣势。
-> Clip模型可以尝试更大性能更强的large系列,用更具细粒度的token表征图像特征,目前仍粗糙。
-> 分辨率不高,理论上只有224×224(minimind-v数据集为节省空间,仅设定为128×128)。
-> ...
-```
-
-# 📌 Acknowledge
-
-> [!TIP]
-> 如果您觉得 `MiniMind-V`对您有所帮助,可以在 GitHub 上加一个⭐
-> 水平有限难免存在未知的纰漏,欢迎所有人在Issues交流指正或提交PR改进项目
-> 您的支持就是持续改进项目的动力,谢谢!
-
-## 🤝[贡献者](https://github.com/jingyaogong/minimind/graphs/contributors)
-
-
-
-
-
-## 😊鸣谢
-
-@xinyanghuang7 :
-🔗实现了完整的多图分支
-
-
- 参考链接 & 感谢以下优秀的论文或项目
-
-- 排名不分任何先后顺序
-- [LlaVA](https://arxiv.org/pdf/2304.08485)
-- [LlaVA-VL](https://arxiv.org/pdf/2310.03744)
-- [Chinese-LLaVA-Vision-Instructions](https://huggingface.co/datasets/LinkSoul/Chinese-LLaVA-Vision-Instructions)
-
-
-
-## 🫶支持者
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-# 🎓 Citation
-
-If you find MiniMind-V helpful in your research or work, please cite:
-
-```bibtex
-@misc{minimind,
- title={MiniMind-V: Train a Tiny VLM from scratch},
- author={Jingyao Gong},
- year={2024},
- howpublished={https://github.com/jingyaogong/minimind-v}
-}
-```
-
-# License
-
-This repository is licensed under the [Apache-2.0 License](LICENSE).
+### MiniMind-V Docs
\ No newline at end of file
diff --git a/README_en.md b/README_en.md
deleted file mode 100644
index f9a468a..0000000
--- a/README_en.md
+++ /dev/null
@@ -1,834 +0,0 @@
-
-
-
-
-
-
-
-
-
-[](https://github.com/jingyaogong/minimind-v/stargazers)
-[](LICENSE)
-[](https://github.com/jingyaogong/minimind-v/commits/master)
-[](https://github.com/jingyaogong/minimind-v/pulls)
-[](https://huggingface.co/collections/jingyaogong/minimind-v-67000833fb60b3a2e1f3597d)
-
-
-
-
-
-
-
-
-
-
-
-
"The Greatest Path is the Simplest"
-
-
-
-
-[中文](./README.md) | English
-
-
-
-* This project aims to train a super-small multimodal vision-language model, **MiniMind-V**, with just a cost of 1.3 RMB
- and 1 hour of work, starting from scratch!
-* The smallest version of **MiniMind-V** is only about $\frac{1}{7000}$ the size of GPT-3, designed to enable fast
- inference and even training on personal GPUs.
-* **MiniMind-V** is an extension of the visual capabilities of the [MiniMind](https://github.com/jingyaogong/minimind)
- pure language model.
-* The project includes full code for the minimalist structure of large VLM models, dataset cleaning, pretraining, and
- supervised fine-tuning (SFT).
-* This is not only the smallest implementation of an open-source VLM model but also a concise tutorial for beginners in
- vision-language models.
-* The hope is that this project can provide a useful example to inspire others and share the joy of creation, helping to
- drive progress in the wider AI community!
-
-> To avoid misunderstandings, the "1 hour" is based on testing (`1 epoch`) with an NVIDIA 3090 hardware device (single GPU), and
-> the "1.3 RMB" refers to GPU server rental costs.
-
-
-
-
-
-[🔗🤖 Online Experience](https://www.modelscope.cn/studios/gongjy/MiniMind-V) | [🔗🎞️ Video Introduction](https://www.bilibili.com/video/BV1Sh1vYBEzY)
-
-
-
-# 📌 Introduction
-
-“Building a plane with Legos is much more exciting than flying in first class!”
-Is it really as complex as imagined to build a VLM-based multimodal large model? How is the code implementation done?
-Is the training process difficult? Now, let's explore the answers and feel the joy of creation together!
-
-> [!TIP]
-> (As of 2025-02-20) The MiniMind-V series has completed the training of the following model versions, with the smallest
-> requiring only 26M (0.026B) parameters, capable of both image recognition and conversation!
-
-| Model (Size) | Inference Memory | Release |
-|---------------------------|------------------|------------|
-| MiniMind2-V (104M)        | 1.1 GB           | 2025.02.20 |
-| MiniMind2-Small-V (26M)   | 0.6 GB           | 2025.02.20 |
-| minimind-v-v1-small (27M) | 0.6 GB | 2024.10.04 |
-| minimind-v-v1 (109M) | 1.1 GB | 2024.10.04 |
-
-### 👉**Recent Updates**
-
-
- 2025-10-24
-
-- Bug fix: model weights mismatch
-- Adapted to ["minimind-1024 update"](https://github.com/jingyaogong/minimind)
-- Code refactoring: training and evaluation scripts standardized
-- Added complete checkpoint resumption support
-
-
-
-
- 2025-04-27
-
-- Compatibility updates
-- Adapted to the new feature in the "minimind" repository
-- Standardized parts of the code
-
-
-
-
- 2025-02-20
-
-- MiniMind2-V updated alongside MiniMind2
-- Significant reduction of all redundant code, standardized code format
-- Major simplification of the model's redundant structure
-- Updated dataset format, expanded with new SFT datasets
-- Better performance than the previous VLM version!
-
-
-
-
-
- More...
-
-**2024-10-05**
-
-- MiniMind-V released on schedule, first open-source release
-
-
-
-# 📌 Quick Start
-
-
-Sharing my hardware and software configuration (for reference only)
-
-* CPU: Intel(R) Core(TM) i9-10980XE CPU @ 3.00GHz
-* RAM: 128 GB
-* GPU: NVIDIA GeForce RTX 3090(24GB) * 8
-* Ubuntu==20.04
-* CUDA==12.2
-* Python==3.10.16
-* [requirements.txt](./requirements.txt)
-
-
-
-### Step 0
-
-```bash
-# Clone the code repository
-git clone https://github.com/jingyaogong/minimind-v
-```
-
-```bash
-# Download the clip model to the ./model/vision_model directory
-git clone https://huggingface.co/openai/clip-vit-base-patch16
-# or
-git clone https://www.modelscope.cn/models/openai-mirror/clip-vit-base-patch16
-```
-
-```bash
-# Download the minimind language model to the ./out directory (as the base language model for training VLM):
-# HuggingFace
-https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch/blob/main/llm_512.pth # or llm_768.pth
-# Domestic source
-https://modelscope.cn/models/gongjy/MiniMind2-V-PyTorch/resolve/master/llm_512.pth # or llm_768.pth
-```
-
-
-## Ⅰ Test an existing model's performance
-
-### 1. Environment Preparation
-
-```bash
-pip install -r requirements.txt -i https://mirrors.aliyun.com/pypi/simple
-```
-
-### 2. Download the model
-
-```bash
-git clone https://huggingface.co/jingyaogong/MiniMind2-V
-```
-
-### 3. Command-line Q&A
-
-```bash
-# load_from='model': load native PyTorch weights, load_from='other path': load transformers format
-python eval_vlm.py --load_from model --weight sft_vlm
-
-# Or use transformers format model
-python eval_vlm.py --load_from MiniMind2-V
-```
-
-### 4. Or start the WebUI
-
-```bash
-python web_demo_vlm.py
-```
-
-## Ⅱ Train from scratch
-
-### 1. Environment Preparation
-
-```bash
-pip install -r requirements.txt -i https://mirrors.aliyun.com/pypi/simple
-```
-
-
-Note: Test if Torch can use CUDA
-
-```python
-import torch
-print(torch.cuda.is_available())
-```
-
-If unavailable, download the whl file from [torch_stable](https://download.pytorch.org/whl/torch_stable.html) for
-installation. Refer
-to [this link](https://blog.csdn.net/weixin_45456738/article/details/141029610?ops_request_misc=&request_id=&biz_id=102&utm_term=%E5%AE%89%E8%A3%85torch&utm_medium=distribute.pc_search_result.none-task-blog-2~all~sobaiduweb~default-2-141029610.nonecase&spm=1018.2226.3001.4187)
-for help.
-
-
-
-### 2. Download Data
-
-Download the required content from the [dataset link](https://huggingface.co/datasets/jingyaogong/minimind-v_dataset)
-and place it under `./dataset`.
-
-
-Note: Dataset Details
-
-Pretrain data:
-```bash
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/pretrain_data.jsonl
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/pretrain_images.zip
-unzip pretrain_images.zip && rm pretrain_images.zip
-```
-
-SFT data:
-```bash
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/sft_data.jsonl
-wget https://hf-mirror.com/datasets/jingyaogong/minimind-v_dataset/resolve/main/sft_images.zip
-unzip sft_images.zip && rm sft_images.zip
-```
-
-`*.jsonl` contains Q&A text, and `*images` are the accompanying image data. After downloading, decompress the image data.
-
-Please reserve about 5GB of space for the dataset. If there is insufficient space for pretrain data, you can try skipping the pretrain training step and proceed directly to SFT training.
-
-
-
-### 3. Start Training
-
-**3.1 Pretraining (Learning image description)**
-
-```bash
-# Basic training command (start from LLM weights, train vision_proj only)
-python train_pretrain_vlm.py --epochs 4 --from_weight llm
-```
-
-> Run pretraining to get `pretrain_vlm_*.pth` as the pretrained model's output weights (* represents the model
-> dimension, default is 512).
-
-**3.2 Supervised Fine-Tuning (Learning image-caption dialogue style)**
-
-```bash
-# Basic training command (start from pretrain weights, full parameter fine-tuning)
-python train_sft_vlm.py --epochs 2 --from_weight pretrain_vlm
-```
-
-> Perform supervised fine-tuning to get `sft_vlm_*.pth` as the output weights for the fine-tuned model.
-
-
-Note: Training Details
-
-**Training Features:**
-- Support checkpoint resumption: add `--from_resume 1` parameter to continue from last interruption
-- Support GPU count changes: automatically convert steps when GPU count changes during resumption
-- Atomic saving: use a temporary file + replacement mechanism to prevent weight corruption if saving is interrupted (see the sketch below)
-- Each save generates `out/**.pth` (model weights) and `checkpoints/**_resume.pth` (training state) files
-
-```bash
-# To resume training after interruption, use the same command and add --from_resume 1
-python train_sft_vlm.py --epochs 4 --from_resume 1
-```
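-
-A minimal sketch of the "temporary file + replacement" idea (not the repository code; the file naming is an assumption):
-
-```python
-import os
-import torch
-
-def atomic_save(state_dict, path):
-    tmp_path = path + ".tmp"
-    torch.save(state_dict, tmp_path)   # write the full checkpoint to a temporary file first
-    os.replace(tmp_path, path)         # atomic within one filesystem, so an interruption cannot corrupt the existing file
-```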
-
-**Parameter Description:**
-- `--from_weight`: base weight name (llm, pretrain_vlm, none, etc.)
-- `--save_weight`: save weight prefix name
-- `--from_resume`: whether to resume training (0=start from scratch, 1=continue from checkpoint)
-- `--freeze_llm`: whether to freeze LLM parameters (pretrain use only)
-- More details can be found in the code
-
-
-
----
-
-### 4. Test the Model's Performance
-
-Ensure that the model `*.pth` file you want to test is located in the `./out/` directory.
-You can also directly download the pre-trained `*.pth` file
-from [here](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch).
-
-```bash
-# Test SFT model (default)
-python eval_vlm.py --weight sft_vlm
-
-# Test Pretrain model
-python eval_vlm.py --weight pretrain_vlm
-```
-
----
-
-> [!TIP]
-> The training scripts are based on PyTorch's native framework and support multi-card acceleration. If your device has
-> N (N>1) GPUs:
-
-Single-machine N-card training method (DDP, supports multi-machine multi-card cluster)
-
-```bash
-torchrun --nproc_per_node N train_xxx.py
-```
-
-
-Note: Other Details
-
-Single-machine N-card training (DeepSpeed)
-
-```bash
-deepspeed --master_port 29500 --num_gpus=N train_xxx.py
-```
-
-You can enable wandb logging during training:
-
-```bash
-# You need to log in: wandb login
-torchrun --nproc_per_node N train_xxx.py --use_wandb
-# and
-python train_xxx.py --use_wandb
-```
-
-By adding the `--use_wandb` parameter, you can log the training process, and after training is complete, you can view
-the process on the wandb website. You can specify the project name and run name by modifying the `wandb_project`
-and `wandb_run_name` parameters.
-
-[Note]: After June 2025, networks in mainland China cannot reliably connect to WandB directly. The MiniMind project therefore switches by default to [SwanLab](https://swanlab.cn/) as the training visualization tool (fully compatible with the WandB API): just change `import wandb` to `import swanlab as wandb`; no other changes are needed.
-
-
-
-# 📌 VLM Detail
-
-The base language model of MiniMind-V (VLM), MiniMind (LLM), comes from the twin
-project [minimind](https://github.com/jingyaogong/minimind). For detailed information on the model structure, training
-specifics, principles, and testing results, please refer to the [minimind](https://github.com/jingyaogong/minimind)
-project. To reduce redundancy, the discussion on LLM-related topics is omitted here, assuming you have a basic
-understanding of MiniMind (LLM).
-
-> Even if you are not very familiar with the details of LLMs, you can still follow the "Quick Start" guide to train a
-> MiniMind-V, as it remains unaffected and the repository focuses on the lowest cost for out-of-the-box use!
-
-MiniMind-V's structure adds two submodules, a Visual Encoder and a feature projection, with a modality-mixing branch to
-support inputs from multiple modalities:
-
-
-
-
-
- [Important] Some Interesting Thoughts
-
-Let's take a moment to think about two questions:
-
-* What is a **Large Language Model (LLM)**?
-* What is a multimodal model?
-
-[This article](https://www.jiqizhixin.com/articles/2024-09-15-3) perfectly aligns with my thoughts:
-Although the name "large language model" (LLM) contains the word "language," they are actually not closely related to
-language; this is just a historical artifact. A more accurate name would be "autoregressive Transformer" or something similar.
-LLMs are more of a general statistical modeling technique, mainly using an autoregressive Transformer to simulate token
-flows. These tokens can represent text, images, audio, action choices, and even molecules—anything, really.
-Therefore, as long as the problem can be converted into a process of simulating a series of discrete tokens, LLM can
-theoretically solve it. In fact, with the increasing maturity of large language model technologies, we may see more and
-more problems falling under this modeling paradigm. In other words, the problem is fixed in using LLM to "predict the
-next token," but the role and meaning of the tokens differ in each domain.
-
-[ZJU-LiXi](https://person.zju.edu.cn/xilics#694283) has also mentioned a similar viewpoint (roughly stated below):
-Text, video, audio, actions, etc., are considered "multimodal" signals in human perception, but the term "modality" is
-essentially just a classification concept based on how humans store information. Just like `.txt` and `.png` files,
-though they differ in visual presentation and higher-level forms, they are fundamentally the same. The concept of "
-multimodal" arose simply because humans need to categorize these signals based on different sensory dimensions.
-However, for machines, regardless of the signal's "modality," they are ultimately presented as a sequence of binary "
-monomodal" numbers. Machines do not differentiate the origin of these signals; they just process and analyze the
-information contained within these sequences.
-
-Personally, I think **Generative Pretrained Transformer (GPT)** is a more fitting term than **Large Language Model (LLM)
-**, and I prefer to use "GPT" to represent models in the LLM/VLM/GPT-like architecture series rather than to ride on
-OpenAI's coattails.
-
-To summarize what GPTs do in one sentence:
-
-A GPT model predicts the next, next-next, next-next-next token, etc., based on the current token... until the model
-outputs the end token; here, the "token" doesn’t necessarily have to be text!
-
-```text
-> For an LLM model, if we need to understand an "image," we just treat the "image" as a special "foreign language" that has never been encountered before, and translate it into the "LLM language" via a "foreign language dictionary."
-> For an LLM model, if we need to understand "audio," we just treat "audio" as a special "foreign language" that has never been encountered before, and translate it into the "LLM language" via a "foreign language dictionary."
-> ...
-```
-
-**To obtain MiniMind-V, we only need to do these 2 things:**
-
-1. Use the **"foreign language dictionary"** that is good at translating images, to translate the image from the **"
- foreign language"** into a model-understandable **"LLM language."**
-2. Fine-tune the LLM so that it and the **"foreign language dictionary"** go through a period of adaptation, thereby
- better understanding images.
-
-The "foreign language dictionary" is referred to as the Visual Encoder model.
-Like LlaVA, Qwen-VL, and other visual language models, MiniMind-V also uses the open-source Clip series models as the
-Visual Encoder.
-Specifically, we use [clip-vit-base-patch16](https://huggingface.co/openai/clip-vit-base-patch16), a classic Visual
-Encoder based on the ViT-B/16 architecture for describing image-text information.
-The input image size is 224×224, and because the patch size is 16×16, it generates 14×14=196 tokens as the input to the
-encoder layer, which finally produces a 1×768-dimensional embedding vector used to compute the contrastive loss against the paired text.
-We don’t need the final embedding representation, so we only take the output from the encoder layer, which is the output
-feature from the core ViT backbone.
-It receives the feature of size 196×768 from the previous layer, which we use as 196 visual tokens to input into
-MiniMind-V.
-After obtaining the image encoder features, integration with the LLM requires aligning the 768-dimensional visual
-tokens with the LLM's text token dimension, and mapping the image features into the same space as the text embeddings. In other
-words, text tokens and the native visual tokens cannot simply be treated as interchangeable; they first need cross-modal feature
-alignment.
-[LlaVA-1](https://arxiv.org/pdf/2304.08485) uses a simple unbiased linear transformation to achieve this, with great
-success, and MiniMind-V does the same.
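-
-The pipeline above can be sketched as follows (a minimal illustration, not the repository code; the 512 LLM dimension and the image path `demo.jpg` are assumptions):
-
-```python
-import torch
-from torch import nn
-from PIL import Image
-from transformers import CLIPVisionModel, CLIPImageProcessor
-
-# Load only the CLIP ViT-B/16 vision tower (the text tower is not needed)
-vision = CLIPVisionModel.from_pretrained("openai/clip-vit-base-patch16").eval()
-processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-base-patch16")
-
-image = Image.open("demo.jpg")                      # any image; the processor resizes it to 224×224
-pixels = processor(images=image, return_tensors="pt").pixel_values
-
-with torch.no_grad():
-    hidden = vision(pixels).last_hidden_state       # [1, 197, 768], including one [CLS] token
-patch_tokens = hidden[:, 1:, :]                     # drop [CLS], keeping the 196 visual tokens
-
-vision_proj = nn.Linear(768, 512, bias=False)       # unbiased linear transform; 512 is an assumed LLM dimension
-visual_embeds = vision_proj(patch_tokens)           # [1, 196, 512], ready to replace the placeholder embeddings
-```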
-
-
-
-With that, the internal structural changes of MiniMind-V are now fully presented.
-
-
-
-
----
-
-Next, let's briefly discuss the changes in the external input and output of MiniMind-V.
-
-The input to the VLM is still a segment of text containing special `` placeholders.
-After computing the text embedding, the vector generated by the image encoder can be projected onto the corresponding
-embedding part of the placeholder, replacing the original placeholder embedding.
-For example:
-
-```text
-\nWhat is in this image?
-```
-
-In `minimind-v`, the image is replaced by a 196-character `@@@...@@@` placeholder. The reason for using 196 characters
-is explained earlier:
-Any image is encoded by the Clip model as 196×768-dimensional tokens,
-thus the `minimind-v` prompt becomes:
-
-```text
-@@@......@@@\nWhat is this image describing?
-```
-
-After calculating the embedding and projection, and replacing the image token part, the entire calculation process to
-output is no different from that of the LLM part.
-
-
-
-For handling multiple images at once, this can be achieved by injecting multiple `` placeholders without needing
-to modify the framework at all.
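-
-A minimal sketch of this placeholder replacement (not the repository code; the function and variable names are assumptions). It covers the multi-image case as well, as long as the images are given in the order their placeholders appear:
-
-```python
-import torch
-
-def replace_image_embeddings(text_embeds, visual_embeds, placeholder_mask):
-    """
-    text_embeds:      [seq_len, dim]  prompt embeddings containing k*196 placeholder tokens
-    visual_embeds:    [k, 196, dim]   projected visual tokens, 196 per image
-    placeholder_mask: [seq_len]       bool, True at the placeholder positions
-    """
-    out = text_embeds.clone()
-    # boolean indexing preserves sequence order, so each image's 196 tokens land on its own placeholder run
-    out[placeholder_mask] = visual_embeds.reshape(-1, visual_embeds.shape[-1])
-    return out
-```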
-
-
- Expansion Ideas for Video Understanding
-
-written by [@xinyanghuang7](https://github.com/xinyanghuang7)
-
-For the video understanding capabilities of multimodal large models, one feasible approach is to refer to the existing
-MiniCPM-V 2.6 Python example for video understanding.
-The main idea is to extract key frames from the video and then perform multi-image inference.
-Therefore, if you want to add video understanding capabilities to MiniMind-V, you can base it on the existing
-multi-image training, refer to the key frame extraction method in this Python script, and increase the number of images
-supported in the training files.
-The more MAX_NUM_FRAMES supported, the more GPU memory it will consume.
-
-```python
-import torch
-from PIL import Image
-from transformers import AutoModel, AutoTokenizer
-from decord import VideoReader, cpu # pip install decord
-
-model = AutoModel.from_pretrained('openbmb/MiniCPM-V-2_6', trust_remote_code=True,
- attn_implementation='sdpa',
- torch_dtype=torch.bfloat16) # sdpa or flash_attention_2, no eager
-model = model.eval().cuda()
-tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-V-2_6', trust_remote_code=True)
-
-MAX_NUM_FRAMES = 64 # if cuda OOM set a smaller number
-
-
-def encode_video(video_path):
- def uniform_sample(l, n):
- gap = len(l) / n
- idxs = [int(i * gap + gap / 2) for i in range(n)]
- return [l[i] for i in idxs]
-
- vr = VideoReader(video_path, ctx=cpu(0))
- sample_fps = round(vr.get_avg_fps() / 1) # FPS
- frame_idx = [i for i in range(0, len(vr), sample_fps)]
- if len(frame_idx) > MAX_NUM_FRAMES:
- frame_idx = uniform_sample(frame_idx, MAX_NUM_FRAMES)
- frames = vr.get_batch(frame_idx).asnumpy()
- frames = [Image.fromarray(v.astype('uint8')) for v in frames]
- print('num frames:', len(frames))
- return frames
-
-
-video_path = "video_test.mp4"
-frames = encode_video(video_path)
-question = "Describe the video"
-msgs = [
- {'role': 'user', 'content': frames + [question]},
-]
-
-# Set decode params for video
-params = {}
-params["use_image_id"] = False
-params["max_slice_nums"] = 2 # If cuda OOM and video resolution is greater than 448*448, set to 1
-
-answer = model.chat(
- image=None,
- msgs=msgs,
- tokenizer=tokenizer,
- **params
-)
-print(answer)
-```
-
-
-
-At this point, all the details of `MiniMind-V` have been presented.
-The `MiniMind-V` model subclass completely inherits from `MiniMind`,
-and is generated with **minimal** changes based on the latter,
-with core algorithm modifications `< 50 lines`, making the migration difficulty very low.
-Therefore, there may be differences in detail from models like `LLaVA`, but the overall idea remains consistent.
-
-# 📌 Experiment
-
-## Ⅰ Dataset
-
-Source: [Chinese-LLaVA-Vision](https://huggingface.co/datasets/LinkSoul/Chinese-LLaVA-Vision-Instructions)
-Contains approximately 570,000 pre-trained images from CC-3M and COCO 2014;
-[llava-en-zh-300k](https://huggingface.co/datasets/BUAADreamer/llava-en-zh-300k)
-Contains 300k instruction fine-tuning data and 150k images.
-The Q&A content has been translated, with better support for Chinese, further organized and resized.
-
-(pretrain_vlm_data.jsonl) Pre-training dataset format:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "Provide a brief description of the given image.\n"
- },
- {
- "role": "assistant",
- "content": "Olive oil is a healthy ingredient for free use."
- }
- ],
- "image": "GCC_train_002582585.jpg"
-}
-```
-
-(sft_vlm_data.jsonl) Single image instruction fine-tuning dataset format:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "What impact does the location of the alarm clock have on sleep quality?"
- },
- {
- "role": "assistant",
- "content": "Place the digital alarm clock on the nightstand..."
- }
- ],
- "image": "train-00000-of-00001_image_0_0.jpg"
-}
-```
-
-(sft_vlm_data_multi.jsonl) Multi-image instruction fine-tuning dataset format:
-
-```json lines
-{
- "conversations": [
- {
- "role": "user",
- "content": "context: Source Image: Target Image: Instruction: What is the correct image edit instruction that can transform the source image to target image?"
- },
- {
- "role": "assistant",
- "content": "take the people out of the back in the photo. Remove the two people behind the woman in the white dress and the man in the blue suit. remove people behind the couple in the center"
- }
- ],
- "image": "0.jpg, 1.jpg"
-}
-```
-
-
- Data Description
-
-* The multi-image dataset is relatively small and contains English conversations, focusing only on scenes with two image
- comparisons. Therefore, the fine-tuning effect is limited, and this is just one reference approach.
-
-* `jsonl` contains textual instructions, and `images.zip` contains the corresponding image data (to be unzipped after
- download).
-
-
-
-Dataset download
-link: ([ModelScope](https://www.modelscope.cn/datasets/gongjy/minimind-v_dataset) | [HuggingFace](https://huggingface.co/datasets/jingyaogong/minimind-v_dataset))
-
-## Ⅱ Training
-
-> train_pretrain_vlm
-
-Pre-training learns general image knowledge from a dataset of 595K samples, e.g. that a deer is a deer and a dog is a dog.
-
-> train_sft_vlm
-
-Instruction fine-tuning learns the real Q&A format for image-related questions from a dataset of 300K real
-conversations, which better aligns with human communication habits.
-
-> train_sft_vlm (multi-image)
-
-Multi-image fine-tuning provides a demo: a bird-comparison dataset with 13.6k real Q&A samples.
-
-During training, the visual encoder (the CLIP model) is frozen and receives no gradient updates; only the Projection and LLM parts
-are trained.
-In pre-training, only the Projection and the last layer of the LLM are learnable.
-In instruction fine-tuning, all parameters of the Projection and LLM are learnable.
-
-> Training Time and Loss Trend (for reference only)
-
-Pretrain [512+8] & [768+16]
-
-
-SFT [512+8] & [768+16]
-
-
-## Ⅲ Model Weights
-
-(Native PyTorch `*.pth` weight files) Download link:
-([ModelScope](https://www.modelscope.cn/models/gongjy/MiniMind2-V-PyTorch) | [HuggingFace](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch))
-
-(`Transformers` format models)
-Download link:
-([ModelScope](https://www.modelscope.cn/profile/gongjy) | [HuggingFace](https://huggingface.co/collections/jingyaogong/minimind-v-67000833fb60b3a2e1f3597d))
-
-> Note: The Transformers version is the `MiniMind-V` model after single-image instruction fine-tuning
-
-# 📌 Test
-
-### Effect Test
-
-#### Single Image Dialogue
-
-
-Each example below shows the same test image (from `dataset/eval_images/`) described by **MiniMind2-V** and **MiniMind2-V-Small**.
-
-**城市车水马龙-city-traffic.jpg**
-
-- MiniMind2-V: The image shows a busy city street with tall buildings on both sides of a long road. The street is packed with cars, trucks, and buses, along with many other vehicles driving on the road. There are many cars visible on the street, some driving at high speed, while others are parked at the side. Additionally, there is a bus parked on the right side of the street. Traffic lights are visible on the street, indicating a busy city environment.
-- MiniMind2-V-Small: The image depicts a bustling urban scene with several cars and a truck driving down the city street. There are many traffic lights visible, some on the left side of the street, and others on the right. Several pedestrians can be seen on the street, some standing closer to the road while others are farther away. A parking sign is located on the left side of the image, hinting at the city environment. Two cars are visible on the street, one on the right and the other on the left, with another one also on the left side. This image captures a typical day in a city environment.
-
-**太空宇航员-Astronaut-Space.jpg**
-
-- MiniMind2-V: The image shows an astronaut in a spacesuit sitting in a large spacecraft. They appear to be embarking on or disembarking from a space mission. Behind the astronaut, there is a rocket launch platform, likely to support the astronaut's mission during the journey. Additionally, an aircraft is parked near a hangar, further suggesting this is an aviation event. There are a few people around the aircraft, some standing close, possibly observing or waiting for the spacecraft to prepare for takeoff.
-- MiniMind2-V-Small: The scene shows a soldier wearing a helmet standing on a large airplane. This aircraft appears to be a military one, likely preparing to board another plane. Another person stands in front, possibly observing the flight process. There are several people around the airplane, some standing on the left side, others on the right. They seem to be watching the pilot's performance. Additionally, a truck is parked near the left side, likely to observe the flight process more closely.
-
-**小狗美女海边-Dog-Woman-Sea.jpg**
-
-- MiniMind2-V: The image shows a woman sitting on the beach, holding a white dog in her hands. She appears to be a woman sitting on the sand, looking at her. A dog is also sitting beside her, looking relaxed and comfortable. Other beachgoers are scattered across the beach, some sitting, while others are farther away. A boat can be seen in the background, indicating that this is a popular beach destination for tourists.
-- MiniMind2-V-Small: Two people are sitting on the beach, one lounging lazily on the sand, while the other is sitting. They seem to be enjoying their time by the sea. There are a few beach chairs, one close to the left side of the sand, another in the middle. Additionally, a dog is lying on the sand, adding to the relaxed atmosphere of the scene.
-
-**彩虹瀑布-Rainbow-Falls.jpg**
-
-- MiniMind2-V: The photo captures a beautiful natural scene with high mountains in the background. By the water, a large fountain spans across the surface, attracting many visitors. There are several people on the water's surface, some standing or sitting around the fountain, while others walk in the water. Overall, this image portrays a beautiful and peaceful environment where people can enjoy scenic views of nature.
-- MiniMind2-V-Small: Under a beautiful blue sky, a massive white waterfall hangs above with a huge stream of wet water. This waterfall is located on a mountain, adding a charming and serene atmosphere to the whole scene. In the background of this image, several boats can be seen, some near the water's edge, others farther away. These boats seem to be preparing for scenic or outdoor activities.
-
-**椅子老人看书-Chair-Elderly-Reading.jpg**
-
-- MiniMind2-V: The image shows a man sitting on a park bench next to a green chair. There is an open book beside him with the words "reading" written on it, suggesting he may be reading. The park has a bench and a park seat, adding life to the surrounding environment. There are several cars and a truck nearby, indicating this is a public space. Additionally, a person can be seen standing at different locations in the park, possibly waiting to cross the road or walk further.
-- MiniMind2-V-Small: An elderly person wearing shorts sits on a park bench surrounded by trees. He seems to be reading a book, possibly engaged in reading. In the background, there is another bench providing ample seating for the scene. You can also see a chair and a table in the background, suggesting this may be an outdoor seating area where people can relax.
-
-**熊猫草地-Panda-Grassland.jpg**
-
-- MiniMind2-V: The image shows a white brown bear sitting on the grass, next to a large bear with brown spots. This bear seems shy or playful as it lies on the grass, resting and looking relaxed.
-- MiniMind2-V-Small: In this image, a brown bear is strolling on the grass. The bear occupies much of the frame, seemingly walking in its natural environment on the grass. In the background, there are several trees, adding natural elements to the scene. A bird is flying near the middle of the scene, bringing a lively atmosphere to the image.
-
-**自行车鲜花-Bicycle-Flowers.jpg**
-
-- MiniMind2-V: The image shows a beautiful vase filled with colorful flowers and bouquets. The bouquets are scattered throughout the vase, creating a visually pleasant scene. The colorful flowers in the vase create a delightful visual. These flowers are placed on a table, likely displayed to showcase their beauty.
-- MiniMind2-V-Small: The scene shows a green and purple bicycle parked next to a building, placed near a large tree. This bicycle is situated nearby, adding some color to the scene. In addition to the bicycle, there are other bicycles, including one in the foreground and another near the center of the background. The presence of the bicycles suggests they may be parked there.
-
-**舞蹈-dance.jpg**
-
-- MiniMind2-V: The woman in the image is wearing a white dress with a black tennis skirt. She is performing a tennis match, likely part of the competition. Several chairs are visible in the background, possibly set up for the audience or other spectators. Additionally, a bench is placed on the left side of the scene, providing a place for people to rest.
-- MiniMind2-V-Small: A woman in white clothing stands on a stage holding a white frisbee. She seems to be participating in a stage dance or competition. Several other people are present in the scene, one standing on the left side of the stage, another on the right, and a third person standing near the right of the venue. The stage has several spectators, some standing, others sitting, with some remaining standing. This appears to be a joyful festival or event.
-
-#### Multiple Image Dialogue (Effect is Limited)
-
-
-The two input images are `dataset/eval_multi_images/bird/0.jpg` and `bird/1.jpg`; outputs from the `512_sft_multi` and `768_sft_multi` weights:
-
-- 512_sft_multi: This image displays a bird scenario: a woman standing with a red and green mixed purple bird perched on her. The woman stands with the bird on her shoulders, while the red bird on her collar stands behind her.
-- 768_sft_multi: The two birds are flying in the same forest, some are in the center of the image, while others are smaller, creating a contrast. The birds’ presence highlights their flight ability and adaptability as they swiftly move through the woods. Additionally, the birds’ positions vary, with one on the left and the other on the right, indicating they are moving close within the same forest. Their natural behavior helps distinguish the two bird species.
-
-### Effect Summary:
-
-Visual signals act as a special "foreign language" for the LLM, so the ability to "learn" this language depends heavily
-on the LLM's capacity: the stronger the LLM, the more capable the corresponding VLM, and the more significant the
-performance gain.
-
-#### Future Areas for Improvement:
-
-```text
-> The projection-based cross-modal feature alignment is fairly simple and may underperform a Cross-Attention approach.
-> A larger, more powerful CLIP model (e.g. the large series) could be tried for finer-grained token representations of image features, which are still coarse.
-> The resolution is not high, theoretically only 224×224 (and the minimind-v dataset uses 128×128 to save space).
-> ...
-```
-
-# 📌 Acknowledge
-
-> [!TIP]
-> If you find `MiniMind-V` helpful, please consider giving it a ⭐ on GitHub.
-> Given the limited expertise, there may be unknown issues; everyone is welcome to discuss and point out problems in Issues,
-> or submit PRs to improve the project.
-> Your support is the driving force behind continuous improvements to the project. Thank you!
-
-## 🤝 [Contributors](https://github.com/jingyaogong/minimind/graphs/contributors)
-
-
-
-
-
-## 😊 Acknowledgments
-
-@xinyanghuang7 :
-🔗Implemented the complete multi-image branch
-
-
- Reference Links & Thanks to the following excellent papers or projects
-
-- In no particular order
-- [LLaVA](https://arxiv.org/pdf/2304.08485)
-- [LLaVA-VL](https://arxiv.org/pdf/2310.03744)
-- [Chinese-LLaVA-Vision-Instructions](https://huggingface.co/datasets/LinkSoul/Chinese-LLaVA-Vision-Instructions)
-
-
-
-## 🫶Supporter
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-# 🎓 Citation
-
-If you find MiniMind-V helpful in your research or work, please cite:
-
-```bibtex
-@misc{minimind,
- title={MiniMind-V: Train a Tiny VLM from scratch},
- author={Jingyao Gong},
- year={2024},
- howpublished={https://github.com/jingyaogong/minimind-v}
-}
-```
-
-# License
-
-This repository is licensed under the [Apache-2.0 License](LICENSE).
-
diff --git a/dataset/__init__.py b/dataset/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git "a/dataset/eval_images/\345\237\216\345\270\202\350\275\246\346\260\264\351\251\254\351\276\231-city-traffic.jpg" "b/dataset/eval_images/\345\237\216\345\270\202\350\275\246\346\260\264\351\251\254\351\276\231-city-traffic.jpg"
deleted file mode 100755
index ede43c4..0000000
Binary files "a/dataset/eval_images/\345\237\216\345\270\202\350\275\246\346\260\264\351\251\254\351\276\231-city-traffic.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\345\244\252\347\251\272\345\256\207\350\210\252\345\221\230-Astronaut-Space.jpg" "b/dataset/eval_images/\345\244\252\347\251\272\345\256\207\350\210\252\345\221\230-Astronaut-Space.jpg"
deleted file mode 100755
index e2cca53..0000000
Binary files "a/dataset/eval_images/\345\244\252\347\251\272\345\256\207\350\210\252\345\221\230-Astronaut-Space.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\345\260\217\347\213\227\347\276\216\345\245\263\346\265\267\350\276\271-Dog-Woman-Sea.jpg" "b/dataset/eval_images/\345\260\217\347\213\227\347\276\216\345\245\263\346\265\267\350\276\271-Dog-Woman-Sea.jpg"
deleted file mode 100755
index 4b9aa27..0000000
Binary files "a/dataset/eval_images/\345\260\217\347\213\227\347\276\216\345\245\263\346\265\267\350\276\271-Dog-Woman-Sea.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\345\275\251\350\231\271\347\200\221\345\270\203-Rainbow-Falls.jpg" "b/dataset/eval_images/\345\275\251\350\231\271\347\200\221\345\270\203-Rainbow-Falls.jpg"
deleted file mode 100755
index 0ea9bf8..0000000
Binary files "a/dataset/eval_images/\345\275\251\350\231\271\347\200\221\345\270\203-Rainbow-Falls.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\346\244\205\345\255\220\350\200\201\344\272\272\347\234\213\344\271\246-Chair-Elderly-Reading.jpg" "b/dataset/eval_images/\346\244\205\345\255\220\350\200\201\344\272\272\347\234\213\344\271\246-Chair-Elderly-Reading.jpg"
deleted file mode 100755
index e2d86e3..0000000
Binary files "a/dataset/eval_images/\346\244\205\345\255\220\350\200\201\344\272\272\347\234\213\344\271\246-Chair-Elderly-Reading.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\347\206\212\347\214\253\350\215\211\345\234\260-Panda-Grassland.jpg" "b/dataset/eval_images/\347\206\212\347\214\253\350\215\211\345\234\260-Panda-Grassland.jpg"
deleted file mode 100755
index 9c6ba26..0000000
Binary files "a/dataset/eval_images/\347\206\212\347\214\253\350\215\211\345\234\260-Panda-Grassland.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\350\207\252\350\241\214\350\275\246\351\262\234\350\212\261-Bicycle-Flowers.jpg" "b/dataset/eval_images/\350\207\252\350\241\214\350\275\246\351\262\234\350\212\261-Bicycle-Flowers.jpg"
deleted file mode 100755
index 755da50..0000000
Binary files "a/dataset/eval_images/\350\207\252\350\241\214\350\275\246\351\262\234\350\212\261-Bicycle-Flowers.jpg" and /dev/null differ
diff --git "a/dataset/eval_images/\350\210\236\350\271\210-dance.jpg" "b/dataset/eval_images/\350\210\236\350\271\210-dance.jpg"
deleted file mode 100755
index 9095487..0000000
Binary files "a/dataset/eval_images/\350\210\236\350\271\210-dance.jpg" and /dev/null differ
diff --git a/dataset/eval_multi_images/bird/0.jpg b/dataset/eval_multi_images/bird/0.jpg
deleted file mode 100755
index 5a98a28..0000000
Binary files a/dataset/eval_multi_images/bird/0.jpg and /dev/null differ
diff --git a/dataset/eval_multi_images/bird/1.jpg b/dataset/eval_multi_images/bird/1.jpg
deleted file mode 100755
index e9b9815..0000000
Binary files a/dataset/eval_multi_images/bird/1.jpg and /dev/null differ
diff --git a/dataset/lm_dataset.py b/dataset/lm_dataset.py
deleted file mode 100644
index 3deeae0..0000000
--- a/dataset/lm_dataset.py
+++ /dev/null
@@ -1,86 +0,0 @@
-import json
-from PIL import Image
-from torch.utils.data import Dataset, DataLoader
-import torch
-from model.model_vlm import MiniMindVLM
-import os
-
-os.environ["TOKENIZERS_PARALLELISM"] = "false"
-
-
-class VLMDataset(Dataset):
- def __init__(self, jsonl_path, images_path, tokenizer, preprocess=None, max_length=512,
- image_special_token='@' * 196):
-
- super().__init__()
- self.samples = self.load_data(jsonl_path)
- self.images_path = images_path
-
- self.tokenizer = tokenizer
- self.max_length = max_length
- self.preprocess = preprocess
- self.image_token = image_special_token
- self.bos_id = tokenizer('<|im_start|>assistant', add_special_tokens=False).input_ids
- self.eos_id = tokenizer('<|im_end|>', add_special_tokens=False).input_ids
-
- def __len__(self):
- return len(self.samples)
-
- def load_data(self, path):
- samples = []
- with open(path, 'r', encoding='utf-8') as f:
- for line_num, line in enumerate(f, 1):
- data = json.loads(line.strip())
- samples.append(data)
- return samples
-
- def _create_chat_prompt(self, conversations):
- messages = []
- for i, turn in enumerate(conversations):
- role = 'user' if i % 2 == 0 else 'assistant'
- messages.append({"role": role, "content": turn['content'].replace('<image>', self.image_token)})
- return self.tokenizer.apply_chat_template(
- messages,
- tokenize=False,
- add_generation_prompt=False
- )
-
- def _generate_loss_mask(self, input_ids):
- loss_mask = [0] * len(input_ids)
- i = 0
- while i < len(input_ids):
- if input_ids[i:i + len(self.bos_id)] == self.bos_id:
- start = i + len(self.bos_id)
- end = start
- while end < len(input_ids):
- if input_ids[end:end + len(self.eos_id)] == self.eos_id:
- break
- end += 1
- for j in range(start + 1, min(end + len(self.eos_id) + 1, self.max_length)):
- loss_mask[j] = 1
- i = end + len(self.eos_id) if end < len(input_ids) else len(input_ids)
- else:
- i += 1
- return loss_mask
-
- def __getitem__(self, index: int):
- sample = self.samples[index]
- image_paths = sample['image']
- prompt = self._create_chat_prompt(sample['conversations'])
- input_ids = self.tokenizer(prompt).input_ids[:self.max_length]
- input_ids += [self.tokenizer.pad_token_id] * (self.max_length - len(input_ids))
- loss_mask = self._generate_loss_mask(input_ids)
-
- X = torch.tensor(input_ids[:-1], dtype=torch.long)
- Y = torch.tensor(input_ids[1:], dtype=torch.long)
- loss_mask = torch.tensor(loss_mask[1:], dtype=torch.long)
-
- image_tensors = []
- for image_name in image_paths.split(','):
- image_name = image_name.strip()
- image = Image.open(f'{self.images_path}/{image_name}')
- image_tensor = MiniMindVLM.image2tensor(image, self.preprocess)
- image_tensors.append(image_tensor)
- image_tensors = torch.stack(image_tensors, dim=0)
-
- return X, Y, loss_mask, image_tensors
diff --git a/images/VLM-structure-moe.png b/docs/images/VLM-structure-moe.png
similarity index 100%
rename from images/VLM-structure-moe.png
rename to docs/images/VLM-structure-moe.png
diff --git a/images/VLM-structure.png b/docs/images/VLM-structure.png
similarity index 100%
rename from images/VLM-structure.png
rename to docs/images/VLM-structure.png
diff --git a/images/llava-structure.png b/docs/images/llava-structure.png
similarity index 100%
rename from images/llava-structure.png
rename to docs/images/llava-structure.png
diff --git a/images/logo.png b/docs/images/logo.png
similarity index 100%
rename from images/logo.png
rename to docs/images/logo.png
diff --git a/images/minimind-v-input.png b/docs/images/minimind-v-input.png
similarity index 100%
rename from images/minimind-v-input.png
rename to docs/images/minimind-v-input.png
diff --git a/images/minimind2-v.gif b/docs/images/minimind2-v.gif
similarity index 100%
rename from images/minimind2-v.gif
rename to docs/images/minimind2-v.gif
diff --git a/images/pretrain_loss.png b/docs/images/pretrain_loss.png
similarity index 100%
rename from images/pretrain_loss.png
rename to docs/images/pretrain_loss.png
diff --git a/images/sft_loss.png b/docs/images/sft_loss.png
similarity index 100%
rename from images/sft_loss.png
rename to docs/images/sft_loss.png
diff --git a/docs/index.md b/docs/index.md
new file mode 100644
index 0000000..58c6bf1
--- /dev/null
+++ b/docs/index.md
@@ -0,0 +1,90 @@
+# Welcome to MiniMind-V!
+
+
+ 
+ "The Greatest Path is the Simplest"
+
+
+## 📌 Introduction
+
+MiniMind-V is a super-small multimodal vision-language model project trained completely from scratch, requiring **only 1.3 RMB + 1 hour** to train a **26M** parameter vision-language model!
+
+- **MiniMind-V** series is extremely lightweight, the smallest version is **1/7000** the size of GPT-3
+- **MiniMind-V** is an extension of the visual capabilities of the [MiniMind](https://github.com/jingyaogong/minimind) pure language model
+- The project open-sources the minimalist structure of VLM models, including:
+ - Dataset cleaning
+ - Pretraining
+ - Supervised Fine-Tuning (SFT)
+ - Multi-image dialogue support
+- All core algorithm code is reconstructed from scratch using native PyTorch, without relying on third-party abstract interfaces
+- This is not only a full-stage open-source reproduction of vision-language models, but also a concise tutorial for getting started with VLMs
+
+!!! note "Training Cost"
+ "1 hour" is based on NVIDIA 3090 hardware (single card) testing `1 epoch`, "1.3 RMB" refers to GPU server rental cost
+
+## ✨ Key Features
+
+- **Ultra-low cost**: Single 3090, 1 hour, 1.3 RMB to train a vision-language model from scratch
+- **Complete pipeline**: Covers Visual Encoder, Projection, Pretraining, SFT full process
+- **Education-friendly**: Clean code, suitable for learning VLM principles
+- **Ecosystem compatible**: Supports `transformers` and mainstream inference frameworks
+
+## 📊 Model List
+
+| Model (Size) | Inference Memory (Approx.) | Release |
+|------------------------|----------------------------|------------|
+| MiniMind2-V (104M) | 1.1 GB | 2025.02.20 |
+| MiniMind2-Small-V (26M) | 0.6 GB | 2025.02.20 |
+| minimind-v-v1-small (27M) | 0.6 GB | 2024.10.04 |
+| minimind-v-v1 (109M) | 1.1 GB | 2024.10.04 |
+
+## 🚀 Quick Navigation
+
+- [Quick Start](quickstart.md) - Environment setup, model download, quick testing
+- [Model Training](training.md) - Pretraining, SFT training full process
+
+## 🔗 Related Links
+
+- **GitHub**: [https://github.com/jingyaogong/minimind-v](https://github.com/jingyaogong/minimind-v)
+- **HuggingFace**: [MiniMind-V Collection](https://huggingface.co/collections/jingyaogong/minimind-v-67000833fb60b3a2e1f3597d)
+- **ModelScope**: [MiniMind-V Models](https://www.modelscope.cn/profile/gongjy)
+- **Online Demo**: [ModelScope Studio](https://www.modelscope.cn/studios/gongjy/MiniMind-V)
+
+## 🎯 Project Highlights
+
+**"Building a plane with Legos is much more exciting than flying in first class!"**
+
+Is it really as complex as imagined to build a VLM-based multimodal large model? How is the code implementation done? Is the training process difficult?
+
+Now, let's explore the answers and feel the joy of creation together!
+
+
+ 
+
+
+## 📚 Core Concepts
+
+### What is a Vision-Language Model (VLM)?
+
+A Vision-Language Model is a multimodal model that can understand both images and text simultaneously. MiniMind-V achieves this through:
+
+1. Using a **Visual Encoder** (CLIP model) to convert images into feature vectors
+2. Aligning visual and text feature spaces through a **Projection layer**
+3. Injecting image features into the language model to enable image-text understanding
+
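+The flow can be pictured as a few tensor operations. The sketch below is illustrative only: random tensors stand in for CLIP output, and names such as `vision_proj` and `placeholder_start` are placeholders rather than the project's actual API.
+
+```python
+import torch
+import torch.nn as nn
+
+# Dimensions as described above: CLIP ViT-B/16 yields 196 visual tokens of
+# dim 768; the LLM hidden size is 512 (Small) or 768 (Base).
+CLIP_DIM, NUM_VISUAL_TOKENS, LLM_DIM = 768, 196, 512
+
+# 1) Visual Encoder output (random here, standing in for CLIP features)
+image_features = torch.randn(1, NUM_VISUAL_TOKENS, CLIP_DIM)     # [B, 196, 768]
+
+# 2) Projection layer: map visual features into the LLM embedding space
+vision_proj = nn.Linear(CLIP_DIM, LLM_DIM, bias=False)
+visual_tokens = vision_proj(image_features)                       # [B, 196, 512]
+
+# 3) Injection: write the projected visual tokens into the text embedding
+#    sequence at the positions reserved by the image placeholder tokens
+text_embeds = torch.randn(1, 32 + NUM_VISUAL_TOKENS, LLM_DIM)     # [B, seq, 512]
+placeholder_start = 4                                              # illustrative position
+text_embeds[:, placeholder_start:placeholder_start + NUM_VISUAL_TOKENS] = visual_tokens
+print(text_embeds.shape)                                           # torch.Size([1, 228, 512])
+```
+
+In the real model, the placeholder positions correspond to the 196 image-placeholder characters described in the training docs.
+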
+### MiniMind-V Design Philosophy
+
+- **Minimal modifications**: Based on the MiniMind language model, only adding Visual Encoder and Projection submodules
+- **Core algorithm changes < 50 lines**: Extremely low migration difficulty
+- **Simple yet effective**: Uses linear projection for cross-modal alignment, simple but effective
+
+## 🎓 Target Audience
+
+- Beginners who want to understand vision-language model principles
+- Researchers who want to quickly build VLM prototypes
+- Developers interested in multimodal large models
+- Teams needing low-cost training of customized vision-language models
+
+## 💡 Get Started
+
+Ready? Let's start your VLM journey from the [Quick Start](quickstart.md) page!
diff --git a/docs/quickstart.md b/docs/quickstart.md
new file mode 100644
index 0000000..8d078cb
--- /dev/null
+++ b/docs/quickstart.md
@@ -0,0 +1,192 @@
+# Quick Start
+
+This page will help you quickly get started with the MiniMind-V project.
+
+## 📋 Requirements
+
+- **Python**: 3.10+
+- **PyTorch**: 1.12+
+- **CUDA**: 12.2+ (optional, for GPU acceleration)
+- **VRAM**: At least 8GB (24GB recommended)
+
+!!! tip "Hardware Configuration Reference"
+ - CPU: Intel i9-10980XE @ 3.00GHz
+ - RAM: 128 GB
+ - GPU: NVIDIA GeForce RTX 3090 (24GB)
+ - Ubuntu: 20.04
+ - CUDA: 12.2
+ - Python: 3.10.16
+
+## 🚀 Testing Existing Models
+
+### Step 0: Preparation
+
+```bash
+# Clone the repository
+git clone https://github.com/jingyaogong/minimind-v
+cd minimind-v
+```
+
+```bash
+# Download CLIP model to ./model/vision_model directory
+git clone https://huggingface.co/openai/clip-vit-base-patch16
+# or
+git clone https://www.modelscope.cn/models/openai-mirror/clip-vit-base-patch16
+```
+
+### Step 1: Install Dependencies
+
+```bash
+pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
+```
+
+!!! warning "Torch CUDA Check"
+ After installation, test if Torch can use CUDA:
+ ```python
+ import torch
+ print(torch.cuda.is_available())
+ ```
+ If unavailable, download the whl file from [torch_stable](https://download.pytorch.org/whl/torch_stable.html) for installation.
+
+### Step 2: Download Model
+
+Download pretrained models from HuggingFace or ModelScope:
+
+```bash
+# From HuggingFace
+git clone https://huggingface.co/jingyaogong/MiniMind2-V
+
+# Or from ModelScope
+git clone https://www.modelscope.cn/models/gongjy/MiniMind2-V.git
+```
+
+### Step 3: Command Line Q&A
+
+```bash
+# load_from='model': load native PyTorch weights, load_from='other path': load transformers format
+python eval_vlm.py --load_from model --weight sft_vlm
+
+# Or use transformers format model
+python eval_vlm.py --load_from MiniMind2-V
+```
+
+### Step 4: Start WebUI (Optional)
+
+```bash
+python scripts/web_demo_vlm.py
+```
+
+Visit `http://localhost:8501` to use the web interface for image-text dialogue.
+
+## 📝 Effect Testing
+
+### Single Image Dialogue Examples
+
+**Test Image 1: City Street Scene**
+
+```text
+Q: Describe the content of this image
+A: The image shows a busy city street with tall buildings on both sides of a long road.
+ The street is packed with cars, trucks, and buses, along with many other vehicles...
+```
+
+**Test Image 2: Panda**
+
+```text
+Q: What animal is in this image?
+A: The image shows a white brown bear sitting on the grass, next to a large bear with brown spots.
+ This bear seems shy or playful as it lies on the grass, resting...
+```
+
+### Model Performance
+
+| Model | Parameters | Inference Speed | Image Understanding |
+|-------|-----------|-----------------|---------------------|
+| MiniMind2-V | 104M | Fast | 😊😊😊😊😊😊 |
+| MiniMind2-Small-V | 26M | Very Fast | 😊😊😊😊 |
+
+## 🔧 Loading from PyTorch Weights
+
+If you want to use native PyTorch model weights (`*.pth` files):
+
+### Download Weight Files
+
+Download the required weight files from:
+
+- [HuggingFace - MiniMind2-V-PyTorch](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch)
+- [ModelScope - MiniMind2-V-PyTorch](https://www.modelscope.cn/models/gongjy/MiniMind2-V-PyTorch)
+
+Files needed:
+- `sft_vlm_512.pth` or `sft_vlm_768.pth` (SFT model weights)
+- Optional: `pretrain_vlm_512.pth` or `pretrain_vlm_768.pth` (pretrained model weights)
+
+### Run Testing
+
+```bash
+# Test SFT model
+python eval_vlm.py --weight sft_vlm
+
+# Test pretrain model
+python eval_vlm.py --weight pretrain_vlm
+```
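+
+If you prefer to load the weights in your own script, the snippet below condenses what the project's `eval_vlm.py` does. Treat it as a sketch: the 512/8 configuration and the paths are examples, so adjust them to the weights you actually downloaded.
+
+```python
+import torch
+from transformers import AutoTokenizer
+from model.model_vlm import MiniMindVLM, VLMConfig
+
+# hidden_size 512 / 8 layers = MiniMind2-Small-V; use 768 / 16 for MiniMind2-V
+config = VLMConfig(hidden_size=512, num_hidden_layers=8)
+model = MiniMindVLM(config, vision_model_path="./model/vision_model/clip-vit-base-patch16")
+
+# Load the native PyTorch weights downloaded above
+state_dict = torch.load("./out/sft_vlm_512.pth", map_location="cpu")
+model.load_state_dict({k: v for k, v in state_dict.items() if 'mask' not in k}, strict=False)
+model = model.eval()
+
+tokenizer = AutoTokenizer.from_pretrained("./model")
+```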
+
+## 📊 Model Architecture
+
+MiniMind-V adds Visual Encoder and Projection layers on top of the MiniMind language model:
+
+
+
+### Core Components
+
+1. **Visual Encoder (CLIP)**
+ - Uses `clip-vit-base-patch16` model
+ - Input image size: 224×224
+ - Output: 196×768 dimensional visual tokens
+
+2. **Projection Layer**
+ - Simple linear transformation
+ - Aligns visual tokens to text embedding space
+
+3. **Language Model (MiniMind)**
+ - Inherits from MiniMind language model
+ - Supports text generation and dialogue
+
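+You can verify the "196×768 visual tokens" figure directly against the CLIP weights downloaded in Step 0. This is a standalone check using a blank test image, not code from the project itself:
+
+```python
+import torch
+from PIL import Image
+from transformers import CLIPProcessor, CLIPVisionModel
+
+path = "./model/vision_model/clip-vit-base-patch16"
+processor = CLIPProcessor.from_pretrained(path)
+encoder = CLIPVisionModel.from_pretrained(path)
+
+# Any RGB image works; the processor resizes it to 224×224
+image = Image.new("RGB", (640, 480), color="white")
+inputs = processor(images=image, return_tensors="pt")
+with torch.no_grad():
+    hidden = encoder(**inputs).last_hidden_state
+# [1, 197, 768]: one class token + 196 patch tokens; the 196 patch tokens
+# are what gets projected into the language model
+print(hidden.shape)
+```
+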
+### Model Parameter Configuration
+
+| Model Name | Params | d_model | n_layers | kv_heads | q_heads |
+|-----------|--------|---------|----------|----------|---------|
+| MiniMind2-Small-V | 26M | 512 | 8 | 2 | 8 |
+| MiniMind2-V | 104M | 768 | 16 | 2 | 8 |
+
+## 🎯 Next Steps
+
+- Check [Model Training](training.md) to learn how to train your own vision-language model from scratch
+- Read the source code to understand VLM implementation principles
+- Try testing with your own images
+
+## ❓ Common Issues
+
+### 1. Model fails to load?
+
+Ensure all dependency files are downloaded:
+- CLIP model weights
+- MiniMind-V model weights
+- tokenizer configuration files
+
+### 2. Out of memory?
+
+- Use the 26M Small version
+- Reduce batch size
+- Use CPU inference (slower)
+
+### 3. Poor image recognition?
+
+- Ensure image quality is clear
+- Try adjusting image size
+- Use more specific question descriptions
+
+## 🔗 Related Resources
+
+- **Online Demo**: [ModelScope Studio](https://www.modelscope.cn/studios/gongjy/MiniMind-V)
+- **Video Introduction**: [Bilibili](https://www.bilibili.com/video/BV1Sh1vYBEzY)
+- **Project Home**: [GitHub](https://github.com/jingyaogong/minimind-v)
diff --git a/docs/training.md b/docs/training.md
new file mode 100644
index 0000000..ee2fe60
--- /dev/null
+++ b/docs/training.md
@@ -0,0 +1,439 @@
+# Model Training
+
+This page introduces how to train MiniMind-V vision-language models from scratch.
+
+## 📊 Data Preparation
+
+### 1. Download Dataset
+
+Download datasets from the following addresses:
+
+- [ModelScope - minimind-v_dataset](https://www.modelscope.cn/datasets/gongjy/minimind-v_dataset)
+- [HuggingFace - minimind-v_dataset](https://huggingface.co/datasets/jingyaogong/minimind-v_dataset)
+
+Create `./dataset` directory and place data files:
+
+```bash
+./dataset/
+├── pretrain_vlm_data.jsonl # Pretrain data (~595K samples)
+├── sft_vlm_data.jsonl # Single-image SFT data (~300K samples)
+├── sft_vlm_data_multi.jsonl # Multi-image SFT data (~13.6K samples)
+├── eval_images/ # Test images
+│ ├── 城市车水马龙-city-traffic.jpg
+│ ├── 熊猫草地-Panda-Grassland.jpg
+│ └── ...
+└── sft_multi_images_trans_image.zip # Multi-image data images (needs extraction)
+```
+
+!!! tip "Dataset Notes"
+ - `*.jsonl` are Q&A datasets
+ - `*images` are accompanying image data, need to be extracted after download
+ - Please reserve about 5GB space for the dataset
+ - If space is insufficient, try skipping pretrain and go directly to SFT training
+
+### 2. Data Format
+
+**Pretrain Data Format** (`pretrain_vlm_data.jsonl`):
+
+```json
+{
+ "conversations": [
+ {
+ "role": "user",
+ "content": "Provide a brief description of the given image.\n"
+ },
+ {
+ "role": "assistant",
+ "content": "Olive oil is a healthy ingredient for free use."
+ }
+ ],
+ "image": "GCC_train_002582585.jpg"
+}
+```
+
+**Single-Image SFT Data Format** (`sft_vlm_data.jsonl`):
+
+```json
+{
+ "conversations": [
+ {
+ "role": "user",
+ "content": "What impact does the location of the alarm clock have on sleep quality?"
+ },
+ {
+ "role": "assistant",
+ "content": "Place the digital alarm clock on the nightstand..."
+ }
+ ],
+ "image": "train-00000-of-00001_image_0_0.jpg"
+}
+```
+
+**Multi-Image SFT Data Format** (`sft_vlm_data_multi.jsonl`):
+
+```json
+{
+ "conversations": [
+ {
+ "role": "user",
+ "content": "context: Source Image: Target Image: Instruction: What is the correct image edit instruction?"
+ },
+ {
+ "role": "assistant",
+ "content": "take the people out of the back in the photo..."
+ }
+ ],
+ "image": "0.jpg, 1.jpg"
+}
+```
+
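+After downloading, a few lines of Python are enough to sanity-check that the files match the formats above. This is a standalone sketch; the training scripts load the data through the project's own dataset class:
+
+```python
+import json
+
+# Peek at the first sample of each jsonl file
+for path in ["./dataset/pretrain_vlm_data.jsonl",
+             "./dataset/sft_vlm_data.jsonl",
+             "./dataset/sft_vlm_data_multi.jsonl"]:
+    with open(path, "r", encoding="utf-8") as f:
+        sample = json.loads(next(f))
+    # 'image' may contain several comma-separated file names (multi-image data)
+    images = [name.strip() for name in sample["image"].split(",")]
+    print(path, "->", len(sample["conversations"]), "turns,", len(images), "image(s)")
+```
+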
+### 3. Data Source
+
+- **Pretrain Data**: [Chinese-LLaVA-Vision](https://huggingface.co/datasets/LinkSoul/Chinese-LLaVA-Vision-Instructions)
+ - Approximately 570,000 images from CC-3M and COCO 2014
+
+- **SFT Data**: [llava-en-zh-300k](https://huggingface.co/datasets/BUAADreamer/llava-en-zh-300k)
+ - 300K instruction fine-tuning data
+ - 150K images
+ - Translated content, better Chinese support
+
+## 🎯 Training Pipeline
+
+All training scripts are located in the `./trainer` directory.
+
+### Step 0: Prepare Base Language Model
+
+Download pure language model weights to the `./out` directory (as the base language model for training VLM):
+
+```bash
+# Download 512-dim model
+wget https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch/resolve/main/llm_512.pth
+
+# Or download 768-dim model
+wget https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch/resolve/main/llm_768.pth
+```
+
+### Step 1: Pretraining (Learning Image Description)
+
+The pretraining stage teaches the model general image knowledge, e.g. that a deer is a deer and a dog is a dog.
+
+```bash
+# Basic training command (start from LLM weights, train vision_proj only)
+python trainer/train_pretrain_vlm.py --epochs 4 --from_weight llm
+
+# Multi-GPU training
+torchrun --nproc_per_node 2 trainer/train_pretrain_vlm.py --epochs 4 --from_weight llm
+
+# Resume training from checkpoint
+python trainer/train_pretrain_vlm.py --epochs 4 --from_resume 1
+```
+
+**Output weights**: `./out/pretrain_vlm_*.pth` (* is the model dimension, default is 512)
+
+!!! info "Training Duration"
+ - MiniMind2-Small-V (26M): ~1h (single 3090)
+ - MiniMind2-V (104M): ~3h (single 3090)
+
+**Training Strategy**:
+- Freeze Visual Encoder (CLIP model) gradients
+- Freeze LLM main parameters (only last layer learnable via `--freeze_llm True`)
+- Only train Vision Projection layer
+
+**Key Parameters**:
+- `--from_weight llm`: Start from LLM weights
+- `--freeze_llm True`: Freeze LLM parameters (pretrain only)
+- `--from_resume 1`: Resume from checkpoint
+- `--save_weight pretrain_vlm`: Save weight prefix name
+
+**Loss Curve**:
+
+
+
+### Step 2: Supervised Fine-Tuning (Learning Image-Caption Dialogue Style)
+
+The SFT stage teaches the model real image-text dialogue format, better aligning with human communication habits.
+
+```bash
+# Basic training command (start from pretrain weights, full parameter fine-tuning)
+python trainer/train_sft_vlm.py --epochs 2 --from_weight pretrain_vlm
+
+# Multi-GPU training
+torchrun --nproc_per_node 2 trainer/train_sft_vlm.py --epochs 2 --from_weight pretrain_vlm
+
+# Resume training from checkpoint
+python trainer/train_sft_vlm.py --epochs 4 --from_resume 1
+```
+
+**Output weights**: `./out/sft_vlm_*.pth`
+
+!!! info "Training Duration"
+ - MiniMind2-Small-V: ~1h (single 3090)
+ - MiniMind2-V: ~3h (single 3090)
+
+**Training Strategy**:
+- Freeze Visual Encoder (CLIP model) gradients
+- Train Vision Projection layer (all parameters learnable)
+- Train LLM (all parameters learnable)
+
+**Key Parameters**:
+- `--from_weight pretrain_vlm`: Start from pretrain weights
+- `--from_resume 1`: Resume from checkpoint
+- `--save_weight sft_vlm`: Save weight prefix name
+
+**Loss Curve**:
+
+
+
+### Step 3 (Optional): Multi-Image Fine-Tuning
+
+Multi-image fine-tuning provides a demo example based on bird comparison dataset.
+
+```bash
+python trainer/train_sft_vlm.py --epochs 4 --use_multi_image
+```
+
+**Notes**:
+- Multi-image dataset is relatively small and contains English conversations
+- Only includes two-image comparison scenarios
+- Fine-tuning effect is limited, provided as a reference approach
+
+!!! warning "Training Notes"
+ **Training Features:**
+
+ - **Checkpoint Resumption**: Add `--from_resume 1` parameter to continue from last interruption
+ - **GPU Count Changes**: Automatically convert steps when GPU count changes during resumption
+ - **Atomic Saving**: Use temporary file + replacement mechanism to prevent weight corruption
+ - **Dual File System**: Each save generates `out/**.pth` (model weights) and `checkpoints/**_resume.pth` (training state)
+
+ **Resume Example:**
+ ```bash
+ # Resume training after interruption
+ python trainer/train_sft_vlm.py --epochs 4 --from_resume 1
+ ```
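+
+The "atomic saving" above is the standard temporary-file-plus-replace pattern. A minimal sketch of the idea (not the trainer's actual code):
+
+```python
+import os
+import tempfile
+import torch
+
+def atomic_save(state: dict, path: str) -> None:
+    """Write to a temp file first, then atomically swap it into place,
+    so an interrupted save never leaves a corrupted checkpoint behind."""
+    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
+    os.close(fd)
+    torch.save(state, tmp_path)
+    os.replace(tmp_path, path)  # atomic rename on the same filesystem
+
+# e.g. atomic_save({"step": step, "model": model.state_dict()}, "out/sft_vlm_512.pth")
+```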
+
+## 📈 Model Architecture Details
+
+MiniMind-V's structure only adds Visual Encoder and feature projection submodules:
+
+
+
+### Core Components
+
+1. **Visual Encoder**
+ - Uses [clip-vit-base-patch16](https://huggingface.co/openai/clip-vit-base-patch16)
+ - Based on ViT-B/16 architecture
+ - Input image: 224×224
+ - Patch size: 16×16
+ - Output: 196×768 dimensional features (196 visual tokens)
+
+2. **Projection Layer**
+ - Simple linear transformation without a bias term
+ - Functions:
+ - Aligns 768-dimensional visual tokens to LLM text tokens
+ - Maps image features to the same space as text embeddings
+ - Achieves cross-modal feature alignment
+
+3. **Language Model**
+ - Fully inherits from MiniMind
+ - Minimal modifications (core algorithm changes < 50 lines)
+
+### Input-Output Mechanism
+
+**Input Format**:
+
+In `minimind-v`, a 196-character `@@@...@@@` placeholder is used to replace the image:
+
+```text
+@@@......@@@\nWhat is this image describing?
+```
+
+Why 196 characters? Because CLIP (ViT-B/16) splits a 224×224 image into 16×16 patches, i.e. (224/16)² = 196 patches, so every image is encoded as a 196×768-dimensional token sequence.
+
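+The relationship between resolution, patch size, and placeholder length is easy to check. The sketch below only illustrates the arithmetic and the prompt layout; `'@' * 196` matches the `image_special_token` used by the project's dataset code:
+
+```python
+# ViT-B/16: a 224×224 image is cut into 16×16 patches
+image_size, patch_size = 224, 16
+num_visual_tokens = (image_size // patch_size) ** 2    # 14 * 14 = 196
+
+# The image is represented in the prompt by a placeholder of the same length,
+# so the projected CLIP features can later be written over exactly 196 positions
+image_special_token = "@" * num_visual_tokens
+prompt = f"{image_special_token}\nWhat is this image describing?"
+print(num_visual_tokens, len(image_special_token))      # 196 196
+```
+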
+
+
+**Multi-Image Implementation**:
+
+Achieved by injecting multiple `<image>` placeholders; no framework modification is needed.
+
+### Model Parameter Configuration
+
+| Model Name | Params | d_model | n_layers | kv_heads | q_heads | Visual Token |
+|-----------|--------|---------|----------|----------|---------|--------------|
+| MiniMind2-Small-V | 26M | 512 | 8 | 2 | 8 | 196×768 |
+| MiniMind2-V | 104M | 768 | 16 | 2 | 8 | 196×768 |
+
+## 🧪 Test Model
+
+### Test Trained Model
+
+Ensure the model `*.pth` file to be tested is in the `./out/` directory.
+
+```bash
+# Test SFT model (default)
+python eval_vlm.py --weight sft_vlm
+
+# Test pretrain model
+python eval_vlm.py --weight pretrain_vlm
+
+# Specify image directory
+python eval_vlm.py --weight sft_vlm --image_dir ./dataset/eval_images/
+```
+
+### Use Pre-Trained Model
+
+You can also directly download and use pre-trained `*.pth` files:
+
+- [HuggingFace - MiniMind2-V-PyTorch](https://huggingface.co/jingyaogong/MiniMind2-V-PyTorch)
+- [ModelScope - MiniMind2-V-PyTorch](https://www.modelscope.cn/models/gongjy/MiniMind2-V-PyTorch)
+
+## 🔧 Multi-GPU Training
+
+### DDP Method
+
+```bash
+torchrun --nproc_per_node N trainer/train_xxx.py
+```
+
+### DeepSpeed Method
+
+```bash
+deepspeed --master_port 29500 --num_gpus=N trainer/train_xxx.py
+```
+
+### Wandb Monitoring
+
+```bash
+# Login first
+wandb login
+
+# Enable wandb
+torchrun --nproc_per_node N trainer/train_xxx.py --use_wandb
+```
+
+Adding the `--use_wandb` flag logs the training run; after training, you can inspect it on the wandb website.
+
+You can specify the project name and run name via the `wandb_project` and `wandb_run_name` parameters.
+
+## 💰 Training Cost
+
+Based on single NVIDIA 3090:
+
+| Dataset Combination | Training Time | Cost (Approx.) | Effect |
+|---------------------|---------------|----------------|---------|
+| pretrain (1 epoch) + sft (1 epoch) | 2h | ≈1.3 RMB | 😊😊😊 Basic dialogue |
+| pretrain (4 epochs) + sft (4 epochs) | 8h | ≈5.4 RMB | 😊😊😊😊😊 Better effect |
+
+!!! success "Quick Reproduction"
+ Using single epoch of pretrain and SFT, single 3090 only needs **2 hours + 1.3 RMB** to train a vision-language ChatBot!
+
+## 📝 Training Tips
+
+### 1. Reduce Memory Usage
+
+- Use smaller batch size
+- Use 512-dim model instead of 768
+- Enable gradient checkpointing (if implemented)
+- Use DeepSpeed ZeRO optimization
+
+### 2. Accelerate Training
+
+- Use multi-GPU training (DDP or DeepSpeed)
+- Use mixed precision training (FP16)
+- Reduce image resolution (but may affect performance)
+
+### 3. Improve Performance
+
+- Increase training epochs
+- Use larger model (768 vs 512)
+- Use higher quality datasets
+- Adjust learning rate
+
+### 4. Checkpoint Resumption
+
+MiniMind-V now supports complete checkpoint resumption:
+
+- **Automatic Saving**: Training state saved every N steps (default 100)
+- **Easy Resumption**: Just add `--from_resume 1` to continue training
+- **GPU Flexibility**: Automatically adapts when GPU count changes
+- **Safe Storage**: Atomic file operations prevent corruption
+
+**Usage Example:**
+```bash
+# Start training
+python trainer/train_sft_vlm.py --epochs 10
+
+# Training interrupted at epoch 5...
+# Resume from checkpoint
+python trainer/train_sft_vlm.py --epochs 10 --from_resume 1
+
+# Resume with different GPU count (4 GPUs -> 2 GPUs)
+torchrun --nproc_per_node 2 trainer/train_sft_vlm.py --epochs 10 --from_resume 1
+```
+
+## 🎓 Core Principles
+
+### Why Pretraining?
+
+Pretraining teaches the model basic image description capabilities, establishing fundamental mapping relationships between image features and text.
+
+### Why SFT?
+
+SFT teaches the model real dialogue formats, making its outputs more aligned with human communication habits rather than simple image descriptions.
+
+### Why Freeze CLIP?
+
+CLIP is already a powerful pre-trained visual encoder. Freezing its parameters can:
+- Significantly reduce trainable parameters
+- Speed up training
+- Prevent overfitting
+- Lower training costs
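+
+In PyTorch terms this is simply disabling gradients for the encoder. A minimal sketch (the path assumes the CLIP download from the Quick Start; the trainer's own code may differ in detail):
+
+```python
+from transformers import CLIPVisionModel
+
+vision_encoder = CLIPVisionModel.from_pretrained(
+    "./model/vision_model/clip-vit-base-patch16")
+
+# No gradients flow into CLIP, so the optimizer only updates the
+# Projection layer (and, during SFT, the LLM)
+for param in vision_encoder.parameters():
+    param.requires_grad = False
+vision_encoder.eval()
+
+trainable = sum(p.numel() for p in vision_encoder.parameters() if p.requires_grad)
+print(f"trainable CLIP params: {trainable}")  # 0
+```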
+
+### Future Improvement Directions
+
+```text
+> More complex Projection layers (e.g., Cross-Attention) may bring better cross-modal alignment
+> Use larger CLIP models (e.g., large series) for finer-grained image features
+> Increase image resolution (currently only 224×224, dataset uses 128×128)
+> Expand multi-image datasets to support more complex multi-image understanding scenarios
+```
+
+## 📚 Related Resources
+
+- **Base Language Model**: [MiniMind](https://github.com/jingyaogong/minimind)
+- **Reference Paper**: [LLaVA](https://arxiv.org/pdf/2304.08485)
+- **Visual Encoder**: [CLIP](https://huggingface.co/openai/clip-vit-base-patch16)
+
+## ❓ Common Issues
+
+### 1. Out of memory during training?
+
+- Reduce batch_size
+- Use 512-dim model
+- Use DeepSpeed ZeRO
+- Single-GPU training instead of multi-GPU
+
+### 2. Training loss not decreasing?
+
+- Check if dataset path is correct
+- Check if learning rate is appropriate
+- Confirm base LLM model is loaded correctly
+- Check for gradient explosion/vanishing
+
+### 3. Multi-GPU training error?
+
+- Ensure all GPUs are visible and CUDA versions are consistent
+- Check if port is occupied (modify `--master_port`)
+- Try using DeepSpeed instead of DDP
+
+### 4. How to use custom dataset?
+
+Prepare jsonl files and corresponding images according to the data format above, and modify the data path in the training script.
+
+## 🎯 Next Steps
+
+Congratulations! You've learned the complete training process of MiniMind-V. Now you can:
+
+- Start training your own vision-language model
+- Try using different datasets
+- Explore the source code implementation
+- Contribute your improvements to the project
diff --git a/eval_vlm.py b/eval_vlm.py
deleted file mode 100644
index 93357ce..0000000
--- a/eval_vlm.py
+++ /dev/null
@@ -1,73 +0,0 @@
-import argparse
-import os
-import warnings
-import torch
-from PIL import Image
-from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
-from model.model_vlm import MiniMindVLM, VLMConfig
-from trainer.trainer_utils import setup_seed
-warnings.filterwarnings('ignore')
-
-def init_model(args):
- tokenizer = AutoTokenizer.from_pretrained(args.load_from)
- if 'model' in args.load_from:
- moe_suffix = '_moe' if args.use_moe else ''
- ckp = f'./{args.save_dir}/{args.weight}_{args.hidden_size}{moe_suffix}.pth'
- model = MiniMindVLM(
- VLMConfig(hidden_size=args.hidden_size, num_hidden_layers=args.num_hidden_layers, use_moe=bool(args.use_moe)),
- vision_model_path="./model/vision_model/clip-vit-base-patch16"
- )
- state_dict = torch.load(ckp, map_location=args.device)
- model.load_state_dict({k: v for k, v in state_dict.items() if 'mask' not in k}, strict=False)
- else:
- model = AutoModelForCausalLM.from_pretrained(args.load_from, trust_remote_code=True)
- model.vision_encoder, model.processor = MiniMindVLM.get_vision_model("./model/vision_model/clip-vit-base-patch16")
-
- print(f'VLM模型参数: {sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6:.2f} M(illion)')
- preprocess = model.processor
- return model.eval().to(args.device), tokenizer, preprocess
-
-
-def main():
- parser = argparse.ArgumentParser(description="MiniMind-V Chat")
- parser.add_argument('--load_from', default='model', type=str, help="模型加载路径(model=原生torch权重,其他路径=transformers格式)")
- parser.add_argument('--save_dir', default='out', type=str, help="模型权重目录")
- parser.add_argument('--weight', default='sft_vlm', type=str, help="权重名称前缀(pretrain_vlm, sft_vlm)")
- parser.add_argument('--hidden_size', default=512, type=int, help="隐藏层维度(512=Small-26M, 768=Base-104M)")
- parser.add_argument('--num_hidden_layers', default=8, type=int, help="隐藏层数量(Small=8, Base=16)")
- parser.add_argument('--use_moe', default=0, type=int, choices=[0, 1], help="是否使用MoE架构(0=否,1=是)")
- parser.add_argument('--max_new_tokens', default=512, type=int, help="最大生成长度")
- parser.add_argument('--temperature', default=0.65, type=float, help="生成温度,控制随机性(0-1,越大越随机)")
- parser.add_argument('--top_p', default=0.85, type=float, help="nucleus采样阈值(0-1)")
- parser.add_argument('--image_dir', default='./dataset/eval_images/', type=str, help="测试图像目录")
- parser.add_argument('--device', default='cuda' if torch.cuda.is_available() else 'cpu', type=str, help="运行设备")
- args = parser.parse_args()
-
- model, tokenizer, preprocess = init_model(args)
- streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
- # 自动测试image_dir中的所有图像
- prompt = "仔细看一下这张图:\n\n<image>\n\n描述一下这个图像的内容。"
- for image_file in sorted(os.listdir(args.image_dir)):
- if image_file.lower().endswith(('.png', '.jpg', '.jpeg', '.bmp')):
- setup_seed(2026) # or setup_seed(random.randint(1, 10000))
- image_path = os.path.join(args.image_dir, image_file)
- image = Image.open(image_path).convert('RGB')
- pixel_values = MiniMindVLM.image2tensor(image, preprocess).to(args.device).unsqueeze(0)
-
- messages = [{"role": "user", "content": prompt.replace('<image>', model.params.image_special_token)}]
- inputs_text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
- inputs = tokenizer(inputs_text, return_tensors="pt", truncation=True).to(args.device)
-
- print(f'[图像]: {image_file}')
- print(f'👶: {prompt.replace('\n', '\\n')}')
- print('🤖️: ', end='')
- model.generate(
- inputs=inputs["input_ids"], attention_mask=inputs["attention_mask"],
- max_new_tokens=args.max_new_tokens, do_sample=True, streamer=streamer,
- pad_token_id=tokenizer.pad_token_id, eos_token_id=tokenizer.eos_token_id,
- top_p=args.top_p, temperature=args.temperature, pixel_values=pixel_values
- )
- print('\n\n')
-
-if __name__ == "__main__":
- main()
diff --git a/mkdocs.yml b/mkdocs.yml
new file mode 100644
index 0000000..cef3ffb
--- /dev/null
+++ b/mkdocs.yml
@@ -0,0 +1,66 @@
+site_name: MiniMind-V
+site_description: MiniMind-V - 超小多模态视觉语言模型 / Super-Small Multimodal Vision-Language Model
+site_author: jingyaogong
+site_url: https://minimind-v.readthedocs.io/
+
+# Search plugin configuration
+plugins:
+ - search:
+ lang: en
+
+# Theme configuration
+theme:
+ name: material
+ favicon: images/logo.png
+ icon:
+ logo: material/book-open-page-variant
+ palette:
+ # Light mode
+ - scheme: default
+ primary: white
+ accent: blue
+ toggle:
+ icon: material/brightness-7
+ name: Switch to dark mode
+ # Dark mode
+ - scheme: slate
+ primary: black
+ accent: blue
+ toggle:
+ icon: material/brightness-4
+ name: Switch to light mode
+ features:
+ # - navigation.instant # Incompatible with multi-language switching; disabled
+ - navigation.tracking # Anchor tracking
+ - navigation.sections # Navigation sections
+ - navigation.expand # Expand navigation by default
+ - navigation.top # Back-to-top button
+ - search.suggest # Search suggestions
+ - search.highlight # Search result highlighting
+ - content.code.copy # Code copy button
+ - toc.follow # TOC follows scroll
+ - toc.integrate # Integrate TOC into the left sidebar
+ language: en
+
+# Navigation structure
+nav:
+ - Home: index.md
+ - Quick Start: quickstart.md
+ - Model Training: training.md
+
+# Markdown extensions
+markdown_extensions:
+ - toc:
+ permalink: true
+ - admonition
+ - pymdownx.highlight:
+ anchor_linenums: true
+ - pymdownx.inlinehilite
+ - pymdownx.snippets
+ - pymdownx.superfences
+ - pymdownx.details
+ - pymdownx.tabbed:
+ alternate_style: true
+ - attr_list
+ - md_in_html
+
diff --git a/model/__init__.py b/model/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/model/model_minimind.py b/model/model_minimind.py
deleted file mode 100644
index ecd99b6..0000000
--- a/model/model_minimind.py
+++ /dev/null
@@ -1,470 +0,0 @@
-# 📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘
-# MiniMind Config
-# 📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘
-
-from transformers import PretrainedConfig
-
-
-class MiniMindConfig(PretrainedConfig):
- model_type = "minimind"
-
- def __init__(
- self,
- dropout: float = 0.0,
- bos_token_id: int = 1,
- eos_token_id: int = 2,
- hidden_act: str = 'silu',
- hidden_size: int = 512,
- intermediate_size: int = None,
- max_position_embeddings: int = 32768,
- num_attention_heads: int = 8,
- num_hidden_layers: int = 8,
- num_key_value_heads: int = 2,
- vocab_size: int = 6400,
- rms_norm_eps: float = 1e-05,
- rope_theta: int = 1000000.0,
- inference_rope_scaling: bool = False,
- flash_attn: bool = True,
- ####################################################
- # Here are the specific configurations of MOE
- # When use_moe is false, the following is invalid
- ####################################################
- use_moe: bool = False,
- num_experts_per_tok: int = 2,
- n_routed_experts: int = 4,
- n_shared_experts: int = 1,
- scoring_func: str = 'softmax',
- aux_loss_alpha: float = 0.1,
- seq_aux: bool = True,
- norm_topk_prob: bool = True,
- **kwargs
- ):
- super().__init__(**kwargs)
- self.dropout = dropout
- self.bos_token_id = bos_token_id
- self.eos_token_id = eos_token_id
- self.hidden_act = hidden_act
- self.hidden_size = hidden_size
- self.intermediate_size = intermediate_size
- self.max_position_embeddings = max_position_embeddings
- self.num_attention_heads = num_attention_heads
- self.num_hidden_layers = num_hidden_layers
- self.num_key_value_heads = num_key_value_heads
- self.vocab_size = vocab_size
- self.rms_norm_eps = rms_norm_eps
- self.rope_theta = rope_theta
- self.inference_rope_scaling = inference_rope_scaling
- # 外推长度 = factor * original_max_position_embeddings
- self.rope_scaling = {
- "beta_fast": 4,
- "beta_slow": 1,
- "factor": 4,
- "original_max_position_embeddings": 2048,
- "type": "yarn"
- } if self.inference_rope_scaling else None
- self.flash_attn = flash_attn
- ####################################################
- # Here are the specific configurations of MOE
- # When use_moe is false, the following is invalid
- ####################################################
- self.use_moe = use_moe
- self.num_experts_per_tok = num_experts_per_tok # 每个token选择的专家数量
- self.n_routed_experts = n_routed_experts # 总的专家数量
- self.n_shared_experts = n_shared_experts # 共享专家
- self.scoring_func = scoring_func # 评分函数,默认为'softmax'
- self.aux_loss_alpha = aux_loss_alpha # 辅助损失的alpha参数
- self.seq_aux = seq_aux # 是否在序列级别上计算辅助损失
- self.norm_topk_prob = norm_topk_prob # 是否标准化top-k概率
-
-
-# 📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘
-# MiniMind Model
-# 📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘📘
-
-import math
-import torch
-import torch.nn.init as init
-import torch.nn.functional as F
-from torch import nn
-from transformers.activations import ACT2FN
-from typing import Optional, Tuple, List, Union
-from transformers import PreTrainedModel, GenerationMixin, PretrainedConfig
-from transformers.modeling_outputs import CausalLMOutputWithPast
-
-
-class RMSNorm(torch.nn.Module):
- def __init__(self, dim: int, eps: float = 1e-5):
- super().__init__()
- self.eps = eps
- self.weight = nn.Parameter(torch.ones(dim))
-
- def _norm(self, x):
- return x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps)
-
- def forward(self, x):
- return self.weight * self._norm(x.float()).type_as(x)
-
-
-def precompute_freqs_cis(dim: int, end: int = int(32 * 1024), rope_base: float = 1e6,
- rope_scaling: Optional[dict] = None):
- freqs = 1.0 / (rope_base ** (torch.arange(0, dim, 2)[: (dim // 2)].float() / dim))
- if rope_scaling is not None:
- orig_max, factor, beta_fast, beta_slow = (
- rope_scaling.get("original_max_position_embeddings", 2048), rope_scaling.get("factor", 4),
- rope_scaling.get("beta_fast", 4.0), rope_scaling.get("beta_slow", 1.0)
- )
- if end / orig_max > 1.0:
- corr_dim = next((i for i in range(dim // 2) if 2 * math.pi / freqs[i] > orig_max), dim // 2)
- power = torch.arange(0, dim // 2, device=freqs.device).float() / max(dim // 2 - 1, 1)
- beta = beta_slow + (beta_fast - beta_slow) * power
- # λ = (β·α - β + 1)/(β·α) YaRN标准公式
- scale = torch.where(torch.arange(dim // 2, device=freqs.device) < corr_dim, (beta * factor - beta + 1) / (beta * factor), 1.0 / factor)
- freqs = freqs * scale
-
- t = torch.arange(end, device=freqs.device)
- freqs = torch.outer(t, freqs).float()
- freqs_cos = torch.cat([torch.cos(freqs), torch.cos(freqs)], dim=-1)
- freqs_sin = torch.cat([torch.sin(freqs), torch.sin(freqs)], dim=-1)
- return freqs_cos, freqs_sin
-
-
-def apply_rotary_pos_emb(q, k, cos, sin, position_ids=None, unsqueeze_dim=1):
- def rotate_half(x):
- return torch.cat((-x[..., x.shape[-1] // 2:], x[..., : x.shape[-1] // 2]), dim=-1)
-
- q_embed = (q * cos.unsqueeze(unsqueeze_dim)) + (rotate_half(q) * sin.unsqueeze(unsqueeze_dim))
- k_embed = (k * cos.unsqueeze(unsqueeze_dim)) + (rotate_half(k) * sin.unsqueeze(unsqueeze_dim))
- return q_embed, k_embed
-
-
-def repeat_kv(x: torch.Tensor, n_rep: int) -> torch.Tensor:
- """torch.repeat_interleave(x, dim=2, repeats=n_rep)"""
- bs, slen, num_key_value_heads, head_dim = x.shape
- if n_rep == 1:
- return x
- return (
- x[:, :, :, None, :].expand(bs, slen, num_key_value_heads, n_rep, head_dim).reshape(bs, slen, num_key_value_heads * n_rep, head_dim)
- )
-
-
-class Attention(nn.Module):
- def __init__(self, args: MiniMindConfig):
- super().__init__()
- self.num_key_value_heads = args.num_attention_heads if args.num_key_value_heads is None else args.num_key_value_heads
- assert args.num_attention_heads % self.num_key_value_heads == 0
- self.n_local_heads = args.num_attention_heads
- self.n_local_kv_heads = self.num_key_value_heads
- self.n_rep = self.n_local_heads // self.n_local_kv_heads
- self.head_dim = args.hidden_size // args.num_attention_heads
- self.q_proj = nn.Linear(args.hidden_size, args.num_attention_heads * self.head_dim, bias=False)
- self.k_proj = nn.Linear(args.hidden_size, self.num_key_value_heads * self.head_dim, bias=False)
- self.v_proj = nn.Linear(args.hidden_size, self.num_key_value_heads * self.head_dim, bias=False)
- self.o_proj = nn.Linear(args.num_attention_heads * self.head_dim, args.hidden_size, bias=False)
- self.attn_dropout = nn.Dropout(args.dropout)
- self.resid_dropout = nn.Dropout(args.dropout)
- self.dropout = args.dropout
- self.flash = hasattr(torch.nn.functional, 'scaled_dot_product_attention') and args.flash_attn
- # print("WARNING: using slow attention. Flash Attention requires PyTorch >= 2.0")
-
- def forward(self,
- x: torch.Tensor,
- position_embeddings: Tuple[torch.Tensor, torch.Tensor], # 修改为接收cos和sin
- past_key_value: Optional[Tuple[torch.Tensor, torch.Tensor]] = None,
- use_cache=False,
- attention_mask: Optional[torch.Tensor] = None):
- bsz, seq_len, _ = x.shape
- xq, xk, xv = self.q_proj(x), self.k_proj(x), self.v_proj(x)
- xq = xq.view(bsz, seq_len, self.n_local_heads, self.head_dim)
- xk = xk.view(bsz, seq_len, self.n_local_kv_heads, self.head_dim)
- xv = xv.view(bsz, seq_len, self.n_local_kv_heads, self.head_dim)
-
- cos, sin = position_embeddings
- xq, xk = apply_rotary_pos_emb(xq, xk, cos[:seq_len], sin[:seq_len])
-
- # kv_cache实现
- if past_key_value is not None:
- xk = torch.cat([past_key_value[0], xk], dim=1)
- xv = torch.cat([past_key_value[1], xv], dim=1)
- past_kv = (xk, xv) if use_cache else None
-
- xq, xk, xv = (
- xq.transpose(1, 2),
- repeat_kv(xk, self.n_rep).transpose(1, 2),
- repeat_kv(xv, self.n_rep).transpose(1, 2)
- )
-
- if self.flash and seq_len > 1 and (attention_mask is None or torch.all(attention_mask == 1)):
- attn_mask = (
- None
- if attention_mask is None
- else attention_mask.view(bsz, 1, 1, -1).expand(bsz, self.n_local_heads, seq_len, -1).bool()
- )
-
- output = F.scaled_dot_product_attention(xq, xk, xv, attn_mask=attn_mask, dropout_p=self.dropout if self.training else 0.0, is_causal=True)
- else:
- scores = (xq @ xk.transpose(-2, -1)) / math.sqrt(self.head_dim)
- scores = scores + torch.triu(
- torch.full((seq_len, seq_len), float("-inf"), device=scores.device),
- diagonal=1
- ).unsqueeze(0).unsqueeze(0) # scores+mask
-
- if attention_mask is not None:
- extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)
- extended_attention_mask = (1.0 - extended_attention_mask) * -1e9
- scores = scores + extended_attention_mask
-
- scores = F.softmax(scores.float(), dim=-1).type_as(xq)
- scores = self.attn_dropout(scores)
- output = scores @ xv
-
- output = output.transpose(1, 2).reshape(bsz, seq_len, -1)
- output = self.resid_dropout(self.o_proj(output))
- return output, past_kv
-
-
-class FeedForward(nn.Module):
- def __init__(self, config: MiniMindConfig):
- super().__init__()
- if config.intermediate_size is None:
- intermediate_size = int(config.hidden_size * 8 / 3)
- config.intermediate_size = 64 * ((intermediate_size + 64 - 1) // 64)
- self.gate_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=False)
- self.down_proj = nn.Linear(config.intermediate_size, config.hidden_size, bias=False)
- self.up_proj = nn.Linear(config.hidden_size, config.intermediate_size, bias=False)
- self.dropout = nn.Dropout(config.dropout)
- self.act_fn = ACT2FN[config.hidden_act]
-
- def forward(self, x):
- return self.dropout(self.down_proj(self.act_fn(self.gate_proj(x)) * self.up_proj(x)))
-
-
-class MoEGate(nn.Module):
- def __init__(self, config: MiniMindConfig):
- super().__init__()
- self.config = config
- self.top_k = config.num_experts_per_tok
- self.n_routed_experts = config.n_routed_experts
-
- self.scoring_func = config.scoring_func
- self.alpha = config.aux_loss_alpha
- self.seq_aux = config.seq_aux
-
- self.norm_topk_prob = config.norm_topk_prob
- self.gating_dim = config.hidden_size
- self.weight = nn.Parameter(torch.empty((self.n_routed_experts, self.gating_dim)))
- self.reset_parameters()
-
- def reset_parameters(self) -> None:
- init.kaiming_uniform_(self.weight, a=math.sqrt(5))
-
- def forward(self, hidden_states):
- bsz, seq_len, h = hidden_states.shape
- hidden_states = hidden_states.view(-1, h)
- logits = F.linear(hidden_states, self.weight, None)
- if self.scoring_func == 'softmax':
- scores = logits.softmax(dim=-1)
- else:
- raise NotImplementedError(f'unsupported scoring function for MoE gating: {self.scoring_func}')
-
- topk_weight, topk_idx = torch.topk(scores, k=self.top_k, dim=-1, sorted=False)
-
- if self.top_k > 1 and self.norm_topk_prob:
- denominator = topk_weight.sum(dim=-1, keepdim=True) + 1e-20
- topk_weight = topk_weight / denominator
-
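- # optional load-balancing auxiliary loss, computed per sequence (seq_aux) or over the flattened batch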
- if self.training and self.alpha > 0.0:
- scores_for_aux = scores
- aux_topk = self.top_k
- topk_idx_for_aux_loss = topk_idx.view(bsz, -1)
- if self.seq_aux:
- scores_for_seq_aux = scores_for_aux.view(bsz, seq_len, -1)
- ce = torch.zeros(bsz, self.n_routed_experts, device=hidden_states.device)
- ce.scatter_add_(1, topk_idx_for_aux_loss,
- torch.ones(bsz, seq_len * aux_topk, device=hidden_states.device)).div_(
- seq_len * aux_topk / self.n_routed_experts)
- aux_loss = (ce * scores_for_seq_aux.mean(dim=1)).sum(dim=1).mean() * self.alpha
- else:
- mask_ce = F.one_hot(topk_idx_for_aux_loss.view(-1), num_classes=self.n_routed_experts)
- ce = mask_ce.float().mean(0)
- Pi = scores_for_aux.mean(0)
- fi = ce * self.n_routed_experts
- aux_loss = (Pi * fi).sum() * self.alpha
- else:
- aux_loss = 0
- return topk_idx, topk_weight, aux_loss
-
-
-class MOEFeedForward(nn.Module):
- def __init__(self, config: MiniMindConfig):
- super().__init__()
- self.config = config
- self.experts = nn.ModuleList([
- FeedForward(config)
- for _ in range(config.n_routed_experts)
- ])
- self.gate = MoEGate(config)
- if config.n_shared_experts > 0:
- self.shared_experts = nn.ModuleList([
- FeedForward(config)
- for _ in range(config.n_shared_experts)
- ])
-
- def forward(self, x):
- identity = x
- orig_shape = x.shape
- bsz, seq_len, _ = x.shape
- # route each token to its top-k experts via the gating network
- topk_idx, topk_weight, aux_loss = self.gate(x)
- x = x.view(-1, x.shape[-1])
- flat_topk_idx = topk_idx.view(-1)
- if self.training:
- x = x.repeat_interleave(self.config.num_experts_per_tok, dim=0)
- y = torch.empty_like(x, dtype=torch.float16)
- for i, expert in enumerate(self.experts):
- y[flat_topk_idx == i] = expert(x[flat_topk_idx == i]).to(y.dtype) # keep dtypes consistent
- y = (y.view(*topk_weight.shape, -1) * topk_weight.unsqueeze(-1)).sum(dim=1)
- y = y.view(*orig_shape)
- else:
- y = self.moe_infer(x, flat_topk_idx, topk_weight.view(-1, 1)).view(*orig_shape)
- if self.config.n_shared_experts > 0:
- for expert in self.shared_experts:
- y = y + expert(identity)
- self.aux_loss = aux_loss
- return y
-
- @torch.no_grad()
- def moe_infer(self, x, flat_expert_indices, flat_expert_weights):
- expert_cache = torch.zeros_like(x)
- idxs = flat_expert_indices.argsort()
- tokens_per_expert = flat_expert_indices.bincount().cpu().numpy().cumsum(0)
- token_idxs = idxs // self.config.num_experts_per_tok
- # Example: if tokens_per_expert = [6, 15, 20, 26], then tokens_per_expert.shape[0] is the number of experts (4 here),
- # and if token_idxs = [3, 7, 19, 21, 24, 25, 4, 5, 6, 10, 11, 12, ...],
- # then token_idxs[:6] -> [3, 7, 19, 21, 24, 25] are the 6 token positions handled by expert 0
- # (a token may be processed by several experts, depending on num_experts_per_tok);
- # the next 9 positions token_idxs[6:15] -> [4, 5, 6, 10, 11, 12, ...] belong to expert 1, and so on
- for i, end_idx in enumerate(tokens_per_expert):
- start_idx = 0 if i == 0 else tokens_per_expert[i - 1]
- if start_idx == end_idx:
- continue
- expert = self.experts[i]
- exp_token_idx = token_idxs[start_idx:end_idx]
- expert_tokens = x[exp_token_idx]
- expert_out = expert(expert_tokens).to(expert_cache.dtype)
- expert_out.mul_(flat_expert_weights[idxs[start_idx:end_idx]])
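- # accumulate the weighted expert outputs back into each token's row (a token routed to k experts receives k contributions)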
- expert_cache.scatter_add_(0, exp_token_idx.view(-1, 1).repeat(1, x.shape[-1]), expert_out)
-
- return expert_cache
-
-
-class MiniMindBlock(nn.Module):
- def __init__(self, layer_id: int, config: MiniMindConfig):
- super().__init__()
- self.num_attention_heads = config.num_attention_heads
- self.hidden_size = config.hidden_size
- self.head_dim = config.hidden_size // config.num_attention_heads
- self.self_attn = Attention(config)
-
- self.layer_id = layer_id
- self.input_layernorm = RMSNorm(config.hidden_size, eps=config.rms_norm_eps)
- self.post_attention_layernorm = RMSNorm(config.hidden_size, eps=config.rms_norm_eps)
- self.mlp = FeedForward(config) if not config.use_moe else MOEFeedForward(config)
-
- def forward(self, hidden_states, position_embeddings, past_key_value=None, use_cache=False, attention_mask=None):
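- # pre-norm block: RMSNorm -> attention -> residual, then RMSNorm -> feed-forward -> residual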
- residual = hidden_states
- hidden_states, present_key_value = self.self_attn(
- self.input_layernorm(hidden_states), position_embeddings,
- past_key_value, use_cache, attention_mask
- )
- hidden_states += residual
- hidden_states = hidden_states + self.mlp(self.post_attention_layernorm(hidden_states))
- return hidden_states, present_key_value
-
-
-class MiniMindModel(nn.Module):
- def __init__(self, config: MiniMindConfig):
- super().__init__()
- self.config = config
- self.vocab_size, self.num_hidden_layers = config.vocab_size, config.num_hidden_layers
- self.embed_tokens = nn.Embedding(config.vocab_size, config.hidden_size)
- self.dropout = nn.Dropout(config.dropout)
- self.layers = nn.ModuleList([MiniMindBlock(l, config) for l in range(self.num_hidden_layers)])
- self.norm = RMSNorm(config.hidden_size, eps=config.rms_norm_eps)
-
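- # precompute the rotary cos/sin tables once up to max_position_embeddings and keep them as non-persistent buffers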
- freqs_cos, freqs_sin = precompute_freqs_cis(dim=config.hidden_size // config.num_attention_heads,
- end=config.max_position_embeddings, rope_base=config.rope_theta,
- rope_scaling=config.rope_scaling)
- self.register_buffer("freqs_cos", freqs_cos, persistent=False)
- self.register_buffer("freqs_sin", freqs_sin, persistent=False)
-
- def forward(self,
- input_ids: Optional[torch.Tensor] = None,
- attention_mask: Optional[torch.Tensor] = None,
- past_key_values: Optional[List[Tuple[torch.Tensor, torch.Tensor]]] = None,
- use_cache: bool = False,
- **kwargs):
- batch_size, seq_length = input_ids.shape
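- # a cache object exposing .layers (e.g. one produced by transformers' generate) is discarded; this model keeps a plain list of (k, v) tuples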
- if hasattr(past_key_values, 'layers'): past_key_values = None
- past_key_values = past_key_values or [None] * len(self.layers)
- start_pos = past_key_values[0][0].shape[1] if past_key_values[0] is not None else 0
-
- hidden_states = self.dropout(self.embed_tokens(input_ids))
-
- position_embeddings = (
- self.freqs_cos[start_pos:start_pos + seq_length],
- self.freqs_sin[start_pos:start_pos + seq_length]
- )
-
- presents = []
- for layer_idx, (layer, past_key_value) in enumerate(zip(self.layers, past_key_values)):
- hidden_states, present = layer(
- hidden_states,
- position_embeddings,
- past_key_value=past_key_value,
- use_cache=use_cache,
- attention_mask=attention_mask
- )
- presents.append(present)
-
- hidden_states = self.norm(hidden_states)
-
- aux_loss = sum(
- layer.mlp.aux_loss
- for layer in self.layers
- if isinstance(layer.mlp, MOEFeedForward)
- )
-
- return hidden_states, presents, aux_loss
-
-
-class MiniMindForCausalLM(PreTrainedModel, GenerationMixin):
- config_class = MiniMindConfig
-
- def __init__(self, config: MiniMindConfig = None):
- self.config = config or MiniMindConfig()
- super().__init__(self.config)
- self.model = MiniMindModel(self.config)
- self.lm_head = nn.Linear(self.config.hidden_size, self.config.vocab_size, bias=False)
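- # tie the token embedding and output projection weights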
- self.model.embed_tokens.weight = self.lm_head.weight
- self.OUT = CausalLMOutputWithPast()
-
- def forward(self,
- input_ids: Optional[torch.Tensor] = None,
- attention_mask: Optional[torch.Tensor] = None,
- past_key_values: Optional[List[Tuple[torch.Tensor, torch.Tensor]]] = None,
- use_cache: bool = False,
- logits_to_keep: Union[int, torch.Tensor] = 0,
- **args):
- h, past_kvs, aux_loss = self.model(
- input_ids=input_ids,
- attention_mask=attention_mask,
- past_key_values=past_key_values,
- use_cache=use_cache,
- **args
- )
- slice_indices = slice(-logits_to_keep, None) if isinstance(logits_to_keep, int) else logits_to_keep
- logits = self.lm_head(h[:, slice_indices, :])
- self.OUT.__setitem__('last_hidden_state', h)
- self.OUT.__setitem__('logits', logits)
- self.OUT.__setitem__('aux_loss', aux_loss)
- self.OUT.__setitem__('past_key_values', past_kvs)
- return self.OUT
diff --git a/model/model_vlm.py b/model/model_vlm.py
deleted file mode 100644
index f53f299..0000000
--- a/model/model_vlm.py
+++ /dev/null
@@ -1,169 +0,0 @@
-import os
-
-import torch
-import warnings
-from .model_minimind import *
-from typing import Optional, Tuple, List, Union
-from torch import nn
-from transformers import CLIPProcessor, CLIPModel
-from typing import List
-
-warnings.filterwarnings('ignore')
-
-
-class VLMConfig(MiniMindConfig):
- model_type = "minimind-v"
-
- def __init__(
- self,
- image_special_token: str = '@' * 196,
- image_ids: List = [34] * 196,
- **kwargs,
- ):
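- # 196 placeholders correspond to the 14x14 patch grid of CLIP ViT-B/16 at 224x224; 34 is the tokenizer id of '@'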
- self.image_special_token = image_special_token
- self.image_ids = image_ids
- super().__init__(**kwargs)
-
-class VisionProj(nn.Module):
- def __init__(self, ve_hidden_size=768, hidden_size=512):
- super().__init__()
- self.ve_hidden_size = ve_hidden_size
- self.hidden_size = hidden_size
- self.vision_proj = nn.Sequential(
- nn.Linear(self.ve_hidden_size, self.hidden_size)
- )
-
- def forward(self, image_encoders):
- vision_proj = self.vision_proj(image_encoders)
- return vision_proj
-
-
-# inherits from the base language model
-class MiniMindVLM(MiniMindForCausalLM):
- config_class = VLMConfig
-
- def __init__(self, params: VLMConfig = None, vision_model_path="./model/vision_model/clip-vit-base-patch16"):
- super().__init__(params)
- if not params: params = VLMConfig()
- self.params = params
- self.vision_encoder, self.processor = self.__class__.get_vision_model(vision_model_path)
- self.vision_proj = VisionProj(hidden_size=params.hidden_size)
-
- @staticmethod
- def get_vision_model(model_path: str):
- from transformers import logging as hf_logging
- hf_logging.set_verbosity_error()
- if not os.path.exists(model_path):
- return None, None
- model = CLIPModel.from_pretrained(model_path)
- processor = CLIPProcessor.from_pretrained(model_path)
- # freeze all parameters of the vision encoder
- for param in model.parameters():
- param.requires_grad = False
- return model.eval(), processor
-
- @staticmethod
- def image2tensor(image, processor):
- if image.mode in ['RGBA', 'LA']: image = image.convert('RGB')
- inputs = processor(images=image, return_tensors="pt")['pixel_values']
- return inputs
-
- @staticmethod
- def get_image_embeddings(image_tensors, vision_model):
- with torch.no_grad():
- outputs = vision_model.vision_model(pixel_values=image_tensors)
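- # drop the CLS token and keep only the patch embeddings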
- img_embedding = outputs.last_hidden_state[:, 1:, :].squeeze()
- return img_embedding
-
- def count_vision_proj(self, tokens, h, vision_tensors=None, seqlen=512):
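- # splice projected image features into the hidden states wherever the image placeholder id sequence appears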
- def find_indices(tokens, image_ids):
- image_ids_tensor = torch.tensor(image_ids).to(tokens.device)
- len_image_ids = len(image_ids)
- if len_image_ids > tokens.size(1):
- return None
- tokens_view = tokens.unfold(1, len_image_ids, 1)
- matches = (tokens_view == image_ids_tensor).all(dim=2)
- return {
- batch_idx: [(idx.item(), idx.item() + len_image_ids - 1) for idx in
- matches[batch_idx].nonzero(as_tuple=True)[0]]
- for batch_idx in range(tokens.size(0)) if matches[batch_idx].any()
- } or None
-
- image_indices = find_indices(tokens, self.params.image_ids)
- if vision_tensors is not None and image_indices:
- vision_proj = self.vision_proj(vision_tensors)
- if len(vision_proj.shape) == 3:
- vision_proj = vision_proj.unsqueeze(0)
- new_h = []
- for i in range(h.size(0)):
- if i in image_indices:
- h_i = h[i]
- img_idx = 0
- for start_idx, end_idx in image_indices[i]:
- if img_idx < vision_proj.size(1):
- h_i = torch.cat((h_i[:start_idx], vision_proj[i][img_idx], h_i[end_idx + 1:]), dim=0)[
- :seqlen]
- img_idx += 1
- new_h.append(h_i)
- else:
- new_h.append(h[i])
- return torch.stack(new_h, dim=0)
- return h
-
- def forward(self,
- input_ids: Optional[torch.Tensor] = None,
- attention_mask: Optional[torch.Tensor] = None,
- past_key_values: Optional[List[Tuple[torch.Tensor, torch.Tensor]]] = None,
- use_cache: bool = False,
- logits_to_keep: Union[int, torch.Tensor] = 0,
- pixel_values: Optional[torch.FloatTensor] = None,
- **args):
- batch_size, seq_length = input_ids.shape
- if hasattr(past_key_values, 'layers'): past_key_values = None
- past_key_values = past_key_values or [None] * len(self.model.layers)
- start_pos = past_key_values[0][0].shape[1] if past_key_values[0] is not None else 0
-
- hidden_states = self.model.dropout(self.model.embed_tokens(input_ids))
-
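- # inject image features only on the prefill pass (start_pos == 0), before any KV cache exists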
- if pixel_values is not None and start_pos == 0:
- if len(pixel_values.shape) == 6:
- pixel_values = pixel_values.squeeze(2)
- bs, num, c, im_h, im_w = pixel_values.shape
- stack_dim = 1 if bs > 1 else 0
- vision_tensors = torch.stack([
- MiniMindVLM.get_image_embeddings(pixel_values[:, i, :, :, :], self.vision_encoder)
- for i in range(num)
- ], dim=stack_dim)
- hidden_states = self.count_vision_proj(tokens=input_ids, h=hidden_states, vision_tensors=vision_tensors,
- seqlen=input_ids.shape[1])
-
- position_embeddings = (
- self.model.freqs_cos[start_pos:start_pos + seq_length],
- self.model.freqs_sin[start_pos:start_pos + seq_length]
- )
-
- presents = []
- for layer_idx, (layer, past_key_value) in enumerate(zip(self.model.layers, past_key_values)):
- hidden_states, present = layer(
- hidden_states,
- position_embeddings,
- past_key_value=past_key_value,
- use_cache=use_cache,
- attention_mask=attention_mask
- )
- presents.append(present)
-
- hidden_states = self.model.norm(hidden_states)
-
- aux_loss = sum(
- layer.mlp.aux_loss
- for layer in self.model.layers
- if isinstance(layer.mlp, MOEFeedForward)
- )
- slice_indices = slice(-logits_to_keep, None) if isinstance(logits_to_keep, int) else logits_to_keep
- logits = self.lm_head(hidden_states[:, slice_indices, :])
- self.OUT.__setitem__('last_hidden_state', hidden_states)
- self.OUT.__setitem__('logits', logits)
- self.OUT.__setitem__('aux_loss', aux_loss)
- self.OUT.__setitem__('past_key_values', presents)
- return self.OUT
diff --git a/model/tokenizer.json b/model/tokenizer.json
deleted file mode 100644
index b255122..0000000
--- a/model/tokenizer.json
+++ /dev/null
@@ -1,31026 +0,0 @@
-{
- "version": "1.0",
- "truncation": null,
- "padding": null,
- "added_tokens": [
- {
- "id": 0,
- "content": "<|endoftext|>",
- "single_word": false,
- "lstrip": false,
- "rstrip": false,
- "normalized": false,
- "special": true
- },
- {
- "id": 1,
- "content": "<|im_start|>",
- "single_word": false,
- "lstrip": false,
- "rstrip": false,
- "normalized": false,
- "special": true
- },
- {
- "id": 2,
- "content": "<|im_end|>",
- "single_word": false,
- "lstrip": false,
- "rstrip": false,
- "normalized": false,
- "special": true
- }
- ],
- "normalizer": null,
- "pre_tokenizer": {
- "type": "ByteLevel",
- "add_prefix_space": false,
- "trim_offsets": true,
- "use_regex": true
- },
- "post_processor": null,
- "decoder": {
- "type": "ByteLevel",
- "add_prefix_space": true,
- "trim_offsets": true,
- "use_regex": true
- },
- "model": {
- "type": "BPE",
- "dropout": null,
- "unk_token": null,
- "continuing_subword_prefix": null,
- "end_of_word_suffix": null,
- "fuse_unk": false,
- "byte_fallback": false,
- "ignore_merges": false,
- "vocab": {
- "<|endoftext|>": 0,
- "<|im_start|>": 1,
- "<|im_end|>": 2,
- "!": 3,
- "\"": 4,
- "#": 5,
- "$": 6,
- "%": 7,
- "&": 8,
- "'": 9,
- "(": 10,
- ")": 11,
- "*": 12,
- "+": 13,
- ",": 14,
- "-": 15,
- ".": 16,
- "/": 17,
- "0": 18,
- "1": 19,
- "2": 20,
- "3": 21,
- "4": 22,
- "5": 23,
- "6": 24,
- "7": 25,
- "8": 26,
- "9": 27,
- ":": 28,
- ";": 29,
- "<": 30,
- "=": 31,
- ">": 32,
- "?": 33,
- "@": 34,
- "A": 35,
- "B": 36,
- "C": 37,
- "D": 38,
- "E": 39,
- "F": 40,
- "G": 41,
- "H": 42,
- "I": 43,
- "J": 44,
- "K": 45,
- "L": 46,
- "M": 47,
- "N": 48,
- "O": 49,
- "P": 50,
- "Q": 51,
- "R": 52,
- "S": 53,
- "T": 54,
- "U": 55,
- "V": 56,
- "W": 57,
- "X": 58,
- "Y": 59,
- "Z": 60,
- "[": 61,
- "\\": 62,
- "]": 63,
- "^": 64,
- "_": 65,
- "`": 66,
- "a": 67,
- "b": 68,
- "c": 69,
- "d": 70,
- "e": 71,
- "f": 72,
- "g": 73,
- "h": 74,
- "i": 75,
- "j": 76,
- "k": 77,
- "l": 78,
- "m": 79,
- "n": 80,
- "o": 81,
- "p": 82,
- "q": 83,
- "r": 84,
- "s": 85,
- "t": 86,
- "u": 87,
- "v": 88,
- "w": 89,
- "x": 90,
- "y": 91,
- "z": 92,
- "{": 93,
- "|": 94,
- "}": 95,
- "~": 96,
- "¡": 97,
- "¢": 98,
- "£": 99,
- "¤": 100,
- "¥": 101,
- "¦": 102,
- "§": 103,
- "¨": 104,
- "©": 105,
- "ª": 106,
- "«": 107,
- "¬": 108,
- "®": 109,
- "¯": 110,
- "°": 111,
- "±": 112,
- "²": 113,
- "³": 114,
- "´": 115,
- "µ": 116,
- "¶": 117,
- "·": 118,
- "¸": 119,
- "¹": 120,
- "º": 121,
- "»": 122,
- "¼": 123,
- "½": 124,
- "¾": 125,
- "¿": 126,
- "À": 127,
- "Á": 128,
- "Â": 129,
- "Ã": 130,
- "Ä": 131,
- "Å": 132,
- "Æ": 133,
- "Ç": 134,
- "È": 135,
- "É": 136,
- "Ê": 137,
- "Ë": 138,
- "Ì": 139,
- "Í": 140,
- "Î": 141,
- "Ï": 142,
- "Ð": 143,
- "Ñ": 144,
- "Ò": 145,
- "Ó": 146,
- "Ô": 147,
- "Õ": 148,
- "Ö": 149,
- "×": 150,
- "Ø": 151,
- "Ù": 152,
- "Ú": 153,
- "Û": 154,
- "Ü": 155,
- "Ý": 156,
- "Þ": 157,
- "ß": 158,
- "à": 159,
- "á": 160,
- "â": 161,
- "ã": 162,
- "ä": 163,
- "å": 164,
- "æ": 165,
- "ç": 166,
- "è": 167,
- "é": 168,
- "ê": 169,
- "ë": 170,
- "ì": 171,
- "í": 172,
- "î": 173,
- "ï": 174,
- "ð": 175,
- "ñ": 176,
- "ò": 177,
- "ó": 178,
- "ô": 179,
- "õ": 180,
- "ö": 181,
- "÷": 182,
- "ø": 183,
- "ù": 184,
- "ú": 185,
- "û": 186,
- "ü": 187,
- "ý": 188,
- "þ": 189,
- "ÿ": 190,
- "Ā": 191,
- "ā": 192,
- "Ă": 193,
- "ă": 194,
- "Ą": 195,
- "ą": 196,
- "Ć": 197,
- "ć": 198,
- "Ĉ": 199,
- "ĉ": 200,
- "Ċ": 201,
- "ċ": 202,
- "Č": 203,
- "č": 204,
- "Ď": 205,
- "ď": 206,
- "Đ": 207,
- "đ": 208,
- "Ē": 209,
- "ē": 210,
- "Ĕ": 211,
- "ĕ": 212,
- "Ė": 213,
- "ė": 214,
- "Ę": 215,
- "ę": 216,
- "Ě": 217,
- "ě": 218,
- "Ĝ": 219,
- "ĝ": 220,
- "Ğ": 221,
- "ğ": 222,
- "Ġ": 223,
- "ġ": 224,
- "Ģ": 225,
- "ģ": 226,
- "Ĥ": 227,
- "ĥ": 228,
- "Ħ": 229,
- "ħ": 230,
- "Ĩ": 231,
- "ĩ": 232,
- "Ī": 233,
- "ī": 234,
- "Ĭ": 235,
- "ĭ": 236,
- "Į": 237,
- "į": 238,
- "İ": 239,
- "ı": 240,
- "IJ": 241,
- "ij": 242,
- "Ĵ": 243,
- "ĵ": 244,
- "Ķ": 245,
- "ķ": 246,
- "ĸ": 247,
- "Ĺ": 248,
- "ĺ": 249,
- "Ļ": 250,
- "ļ": 251,
- "Ľ": 252,
- "ľ": 253,
- "Ŀ": 254,
- "ŀ": 255,
- "Ł": 256,
- "ł": 257,
- "Ń": 258,
- "Ġt": 259,
- "Ġa": 260,
- "in": 261,
- "he": 262,
- "re": 263,
- "ï¼": 264,
- "ä¸": 265,
- "on": 266,
- "at": 267,
- "çļ": 268,
- "çļĦ": 269,
- "ï¼Į": 270,
- "Ġs": 271,
- "Ġc": 272,
- "nd": 273,
- "ãĢ": 274,
- "er": 275,
- "Ġthe": 276,
- "es": 277,
- "en": 278,
- "or": 279,
- "an": 280,
- "Ġand": 281,
- "ing": 282,
- "Ġp": 283,
- "it": 284,
- "al": 285,
- "ãĢĤ": 286,
- "Ġo": 287,
- "Ġw": 288,
- "ä»": 289,
- "Ġto": 290,
- "is": 291,
- "ou": 292,
- "Ġm": 293,
- "äº": 294,
- "Ġin": 295,
- "Ġf": 296,
- "Ġb": 297,
- "ed": 298,
- "ion": 299,
- "åı": 300,
- "ic": 301,
- "Ġd": 302,
- "Ġof": 303,
- "le": 304,
- "ar": 305,
- "ro": 306,
- "ĠĠ": 307,
- "åħ": 308,
- "ent": 309,
- "æľ": 310,
- "Ġe": 311,
- "åĴ": 312,
- "è¿": 313,
- "ä½": 314,
- "åĴĮ": 315,
- "æĪ": 316,
- "å®": 317,
- "åĪ": 318,
- "ve": 319,
- "us": 320,
- "Ġre": 321,
- "Ġh": 322,
- "Ġth": 323,
- "as": 324,
- "ct": 325,
- "çĶ": 326,
- "om": 327,
- "åľ": 328,
- "å¤": 329,
- "æĺ": 330,
- "åĬ": 331,
- "åIJ": 332,
- "ä¸Ģ": 333,
- "im": 334,
- "è¯": 335,
- "æĸ": 336,
- "ation": 337,
- "lo": 338,
- "ç»": 339,
- "Ġbe": 340,
- "ãĢģ": 341,
- "id": 342,
- "Ġcan": 343,
- "il": 344,
- "æĺ¯": 345,
- "ä¹": 346,
- "è®": 347,
- "ĠA": 348,
- "Ġthat": 349,
- "ĠT": 350,
- "以": 351,
- "ch": 352,
- "Ġy": 353,
- "ce": 354,
- "ï¼ļ": 355,
- "ot": 356,
- "ers": 357,
- "Ġn": 358,
- "éĢ": 359,
- "ra": 360,
- "å°": 361,
- "Ġg": 362,
- "Ġyou": 363,
- "åŃ": 364,
- "Ġpro": 365,
- "et": 366,
- "åº": 367,
- "åľ¨": 368,
- "ly": 369,
- "Ġis": 370,
- "个": 371,
- "Ġl": 372,
- "ur": 373,
- "Ġfor": 374,
- "åı¯": 375,
- "éĩ": 376,
- "st": 377,
- "çļĦæ": 378,
- "ut": 379,
- "Ġhe": 380,
- "if": 381,
- "ĥ½": 382,
- "ä¼": 383,
- "ĠI": 384,
- "è¡": 385,
- "ir": 386,
- "ith": 387,
- "å¹": 388,
- "Ġare": 389,
- "ig": 390,
- "Ġst": 391,
- "el": 392,
- "ol": 393,
- "å¸": 394,
- "ul": 395,
- "æĿ": 396,
- "æĪij": 397,
- "Ġon": 398,
- "è¦": 399,
- "æľī": 400,
- "æĹ": 401,
- "å¯": 402,
- "è§": 403,
- "è¦ģ": 404,
- "Ġus": 405,
- "ay": 406,
- "æķ": 407,
- "çī": 408,
- "ow": 409,
- "ment": 410,
- "ç͍": 411,
- "ess": 412,
- "ä¸Ń": 413,
- "们": 414,
- "人": 415,
- "åĩ": 416,
- "Ġex": 417,
- "ĠĠĠĠ": 418,
- "åĽ": 419,
- "åĮ": 420,
- "å¼": 421,
- "Ġcon": 422,
- "se": 423,
- "èĥ½": 424,
- "çİ": 425,
- "Ġan": 426,
- "Ġwith": 427,
- "为": 428,
- "ate": 429,
- "iv": 430,
- "am": 431,
- "Ġas": 432,
- "ure": 433,
- "è¿Ļ": 434,
- "åĨ": 435,
- "çŃ": 436,
- "Ġor": 437,
- "å·": 438,
- "Ġal": 439,
- "ies": 440,
- "ç§": 441,
- "Ġim": 442,
- "æĢ": 443,
- "ver": 444,
- "ab": 445,
- "äºĨ": 446,
- "Ġsu": 447,
- "Ġde": 448,
- "ge": 449,
- "th": 450,
- "åı¯ä»¥": 451,
- "èĢ": 452,
- "ä¸į": 453,
- "å¾": 454,
- "ĠAI": 455,
- "Ġen": 456,
- "éĹ": 457,
- "æī": 458,
- "ak": 459,
- "ive": 460,
- "Ġmo": 461,
- "å¥": 462,
- "éĿ": 463,
- "çĽ": 464,
- "ity": 465,
- "ä¿": 466,
- "un": 467,
- "è´": 468,
- "åį": 469,
- "Ġit": 470,
- "Ġimp": 471,
- "ect": 472,
- "æł": 473,
- "å½": 474,
- "èĩ": 475,
- "é¢": 476,
- "åĵ": 477,
- "æ³": 478,
- "ort": 479,
- "ad": 480,
- "æŀ": 481,
- "em": 482,
- "Ġcom": 483,
- "å¦": 484,
- "her": 485,
- "ere": 486,
- "ĠS": 487,
- "ial": 488,
- "ĠC": 489,
- "ĠThe": 490,
- "çIJ": 491,
- "çĶŁ": 492,
- "æĦ": 493,
- "pp": 494,
- "æŃ": 495,
- "æĸ¹": 496,
- "qu": 497,
- "Ġwh": 498,
- "å¦Ĥ": 499,
- "éľ": 500,
- "ant": 501,
- "Ġle": 502,
- "Ġv": 503,
- "æĭ": 504,
- "æĬ": 505,
- "ust": 506,
- "æĹ¶": 507,
- "çŃī": 508,
- "åij": 509,
- "对": 510,
- "ter": 511,
- "ld": 512,
- "è¡Į": 513,
- "Ġch": 514,
- "ud": 515,
- "éľĢ": 516,
- "æ°": 517,
- "æĪIJ": 518,
- "Ġ|": 519,
- "ac": 520,
- "ain": 521,
- "iz": 522,
- "æı": 523,
- "ions": 524,
- "Ġha": 525,
- "æĽ": 526,
- "--": 527,
- "æĿ¥": 528,
- "ome": 529,
- "å¿": 530,
- "'s": 531,
- "Ġne": 532,
- "est": 533,
- "ä¾": 534,
- "um": 535,
- "åΰ": 536,
- "åľ°": 537,
- "ist": 538,
- "âĢ": 539,
- "çī©": 540,
- "ä¸Ģ个": 541,
- "lp": 542,
- "æİ": 543,
- "èĩª": 544,
- "Ġhelp": 545,
- "Ġtheir": 546,
- "æĶ": 547,
- "ä½ľ": 548,
- "ä¼ļ": 549,
- "æĮ": 550,
- "æĪij们": 551,
- "nt": 552,
- "äºİ": 553,
- "åĪĨ": 554,
- "res": 555,
- "pe": 556,
- "åĩº": 557,
- "ide": 558,
- "æĥ": 559,
- "ĠH": 560,
- "è¾": 561,
- "ĠM": 562,
- "ff": 563,
- "æ¯": 564,
- "od": 565,
- "ical": 566,
- "Ġwor": 567,
- "ä¸Ĭ": 568,
- "are": 569,
- "æĽ´": 570,
- "Ġyour": 571,
- "ä¸ĭ": 572,
- "èµ": 573,
- "ations": 574,
- "æķ°": 575,
- "Ġte": 576,
- "åİ": 577,
- "çIJĨ": 578,
- "ĠTh": 579,
- "è¿ĩ": 580,
- "å¹¶": 581,
- "du": 582,
- "éĿ¢": 583,
- "Ġad": 584,
- "ill": 585,
- "æµ": 586,
- "好": 587,
- "oc": 588,
- "act": 589,
- "éľĢè¦ģ": 590,
- "ä»ĸ": 591,
- "å±": 592,
- "Ġr": 593,
- "Ġmore": 594,
- "åѦ": 595,
- "ç®": 596,
- "igh": 597,
- "äºĽ": 598,
- "ĠB": 599,
- "åĬ¨": 600,
- "åĵģ": 601,
- "èī": 602,
- "ple": 603,
- "Ġinc": 604,
- "åIJĮ": 605,
- "Ġexp": 606,
- "ould": 607,
- "ä½ł": 608,
- "æį": 609,
- "æıIJ": 610,
- "大": 611,
- "çݰ": 612,
- "pt": 613,
- "ĠP": 614,
- "all": 615,
- "åĬł": 616,
- "ç§į": 617,
- "Ġse": 618,
- "åĬĽ": 619,
- "out": 620,
- "Ġhave": 621,
- "çº": 622,
- "ä½ĵ": 623,
- "Ġprov": 624,
- "åĮĸ": 625,
- "å¤ļ": 626,
- "å®ļ": 627,
- "Ġused": 628,
- "éĢļ": 629,
- "cc": 630,
- "è¿Ľ": 631,
- "æ´": 632,
- "Ġsh": 633,
- "Ġab": 634,
- "os": 635,
- "Ġres": 636,
- "ĠThis": 637,
- "ç¨": 638,
- "æĢ§": 639,
- "age": 640,
- "ri": 641,
- "æ¸": 642,
- "able": 643,
- "åŃIJ": 644,
- "Ġby": 645,
- "åıij": 646,
- "éĩı": 647,
- "åºĶ": 648,
- "Ġlo": 649,
- "使": 650,
- "åħ¶": 651,
- "é«": 652,
- "éĻ": 653,
- "é«ĺ": 654,
- "度": 655,
- "è§£": 656,
- "é£": 657,
- "å°Ĩ": 658,
- "æ³ķ": 659,
- "and": 660,
- "ä¿Ŀ": 661,
- "ans": 662,
- "for": 663,
- "rom": 664,
- "reat": 665,
- "Ġpl": 666,
- "çļĦç": 667,
- "常": 668,
- "è½": 669,
- "Ġwe": 670,
- "表": 671,
- "ake": 672,
- "æĪĸ": 673,
- "é¢ĺ": 674,
- "åŁ": 675,
- "Ġme": 676,
- "æĸĩ": 677,
- "ther": 678,
- "ke": 679,
- "å®¶": 680,
- "åIJĪ": 681,
- "æľĢ": 682,
- "ine": 683,
- "Ġsome": 684,
- "ç±": 685,
- "éĩį": 686,
- "æŀľ": 687,
- "ĠW": 688,
- "ĠE": 689,
- "éĺ": 690,
- "our": 691,
- "rou": 692,
- "çĤ": 693,
- "æ±": 694,
- "åħ³": 695,
- "Ġint": 696,
- "ance": 697,
- "ä¹Ł": 698,
- "éģ": 699,
- "ĠĠĠ": 700,
- "å®ĥ": 701,
- "ag": 702,
- "æ¬": 703,
- "00": 704,
- "è°": 705,
- "ult": 706,
- "yst": 707,
- "éĹ´": 708,
- "ç³": 709,
- "Ġtr": 710,
- "pl": 711,
- "art": 712,
- "æĦŁ": 713,
- "æĤ": 714,
- "ata": 715,
- "ĠF": 716,
- "form": 717,
- "计": 718,
- "Ġfrom": 719,
- "ĠD": 720,
- "éĹ®": 721,
- "ight": 722,
- "ces": 723,
- "æį®": 724,
- "lop": 725,
- "ä¹ĭ": 726,
- "Ġfe": 727,
- "åģ": 728,
- "velop": 729,
- "Ġ1": 730,
- "åĽł": 731,
- "ks": 732,
- "æ²": 733,
- "Ġu": 734,
- "å°ı": 735,
- "ystem": 736,
- "Ġdis": 737,
- "ĠR": 738,
- "gy": 739,
- "å·¥": 740,
- "ç¨ĭ": 741,
- "å¢": 742,
- "ence": 743,
- "èĤ": 744,
- "ç¡": 745,
- "Ġtra": 746,
- "å»": 747,
- "åħ¥": 748,
- "ign": 749,
- "alth": 750,
- "Ġsuch": 751,
- "ach": 752,
- "æĻ": 753,
- "arn": 754,
- "Ġdata": 755,
- "è¶": 756,
- "å®ŀ": 757,
- "so": 758,
- "Ġdevelop": 759,
- "ç¤": 760,
- "Ġacc": 761,
- "ast": 762,
- "èĢĮ": 763,
- "Ġ\"": 764,
- "Ġother": 765,
- "建": 766,
- "Ġeff": 767,
- "ç«": 768,
- "Ġman": 769,
- "åħ¬": 770,
- "åĢ": 771,
- "çĦ": 772,
- "ms": 773,
- "å¼ı": 774,
- "èī²": 775,
- "å¾Ĺ": 776,
- "ific": 777,
- "Ġj": 778,
- "Ġro": 779,
- "Ġhas": 780,
- "chn": 781,
- "olo": 782,
- "åζ": 783,
- "èĬ": 784,
- "使ç͍": 785,
- "ous": 786,
- "ual": 787,
- "Ġat": 788,
- "Ġem": 789,
- "ell": 790,
- "Ġsystem": 791,
- "Ġhealth": 792,
- "ities": 793,
- "Ġexam": 794,
- "ib": 795,
- "éĶ": 796,
- "Ġabout": 797,
- "产": 798,
- "åIJİ": 799,
- "æĦı": 800,
- "ç±»": 801,
- "Ġpre": 802,
- "æĤ¨": 803,
- "Ġalso": 804,
- "ents": 805,
- "Ġind": 806,
- "ind": 807,
- "éĢĤ": 808,
- "Ġtechn": 809,
- "ress": 810,
- "æĥħ": 811,
- "éĹ®é¢ĺ": 812,
- "Ġuse": 813,
- "ï¼Ł": 814,
- "Ġincl": 815,
- "Ġspe": 816,
- "ich": 817,
- "ps": 818,
- "æľº": 819,
- "Ġthey": 820,
- "ie": 821,
- "Ġhow": 822,
- "Ġwork": 823,
- "ä¸ļ": 824,
- "ç´": 825,
- "Ġimpro": 826,
- "Ġlearn": 827,
- "æĸ°": 828,
- "çĤ¹": 829,
- "Ġcont": 830,
- "ard": 831,
- "çĦ¶": 832,
- "æľ¬": 833,
- "ç³»": 834,
- "ç¡®": 835,
- "设": 836,
- "åħ·": 837,
- "éĢī": 838,
- "èĢħ": 839,
- "éħ": 840,
- "gh": 841,
- "__": 842,
- "Ġnot": 843,
- "çľ": 844,
- "缸": 845,
- "Ġprovide": 846,
- "åī": 847,
- "ional": 848,
- "Ġens": 849,
- "ä¸İ": 850,
- "è´¨": 851,
- "ential": 852,
- "ç»ı": 853,
- "å¿ĥ": 854,
- "ang": 855,
- "æŃ¤": 856,
- "end": 857,
- "Ġpo": 858,
- "è¿Ľè¡Į": 859,
- "ice": 860,
- "Ġ-": 861,
- "Ġway": 862,
- "å·±": 863,
- "Ġ2": 864,
- "ime": 865,
- "ç½": 866,
- "èĩªå·±": 867,
- "Ġun": 868,
- "bot": 869,
- "Ġinclud": 870,
- "ated": 871,
- "æ°´": 872,
- "éķ": 873,
- "æĮģ": 874,
- "代": 875,
- "é¡": 876,
- "æīĢ": 877,
- "çĿ": 878,
- "pport": 879,
- "ood": 880,
- "ike": 881,
- "ru": 882,
- "Ġcomm": 883,
- "ĠL": 884,
- "ä¿¡": 885,
- "ĠG": 886,
- "çŁ": 887,
- "ç͵": 888,
- "Ġwas": 889,
- "low": 890,
- "erv": 891,
- "åĮħ": 892,
- "ĠĠĠĠĠĠĠĠ": 893,
- "Ġwhe": 894,
- "dit": 895,
- "Ġwhich": 896,
- "Ġcomp": 897,
- "éª": 898,
- "ore": 899,
- "ç¾": 900,
- "Ġ=": 901,
- "çī¹": 902,
- "iff": 903,
- "ert": 904,
- "æģ": 905,
- "rit": 906,
- "Ġrec": 907,
- "åĨħ": 908,
- "æĺİ": 909,
- "ors": 910,
- "Ġpat": 911,
- "----": 912,
- "æŁ": 913,
- "Ġapp": 914,
- "ns": 915,
- "åĬ¡": 916,
- "aly": 917,
- "ace": 918,
- "æ´»": 919,
- "ä¾Ľ": 920,
- "av": 921,
- "主": 922,
- "Ġpers": 923,
- "çĥ": 924,
- "该": 925,
- "Ġmy": 926,
- "ç©": 927,
- "eri": 928,
- "让": 929,
- "æĬĢ": 930,
- "éķ¿": 931,
- "ack": 932,
- "ĠN": 933,
- "Ġdiff": 934,
- "Ġthis": 935,
- "åĿ": 936,
- "Ġensure": 937,
- "å½ĵ": 938,
- "Ġout": 939,
- "Ġcl": 940,
- "Ġk": 941,
- "é¦": 942,
- "ount": 943,
- "çݯ": 944,
- "åĬ©": 945,
- "Ġtechnolo": 946,
- "Ġthese": 947,
- "ful": 948,
- "éļ": 949,
- "æ·": 950,
- "ä¸ĢäºĽ": 951,
- "Ġsoc": 952,
- "å¼Ģ": 953,
- "天": 954,
- "Ġev": 955,
- "Ġredu": 956,
- "Ġthem": 957,
- "Ġ(": 958,
- "éĥ½": 959,
- "æĪ·": 960,
- "è·": 961,
- "åľº": 962,
- "æ°Ķ": 963,
- "ĠY": 964,
- "è¯Ń": 965,
- "éĢļè¿ĩ": 966,
- "å±ķ": 967,
- "Ġco": 968,
- "å½±": 969,
- "ç¬": 970,
- "Ġanaly": 971,
- "æ¯Ķ": 972,
- "åħ¨": 973,
- "Ġimprove": 974,
- "ç»ĵ": 975,
- "å¹´": 976,
- "çķ": 977,
- "çĿĢ": 978,
- "Ġhum": 979,
- "Ġqu": 980,
- "ç®Ĺ": 981,
- "ĠO": 982,
- "é£Ł": 983,
- "ility": 984,
- "Ġsystems": 985,
- "åıĺ": 986,
- "ail": 987,
- "ç¼": 988,
- "çł": 989,
- "è¿Ļ个": 990,
- "æıIJä¾Ľ": 991,
- "ase": 992,
- "åŀ": 993,
- "ments": 994,
- "Ġpot": 995,
- "Ġany": 996,
- "ä½Ĩ": 997,
- "Ġcons": 998,
- "ĠIt": 999,
- "æł¼": 1000,
- "Ġar": 1001,
- "æľ¯": 1002,
- "éĿŀ": 1003,
- "Ġdo": 1004,
- "Ġmay": 1005,
- "æĭ©": 1006,
- "ue": 1007,
- "éĢīæĭ©": 1008,
- "ry": 1009,
- "éĥ": 1010,
- "Ġlike": 1011,
- "ong": 1012,
- "èģ": 1013,
- "``": 1014,
- "ile": 1015,
- "æ±Ĥ": 1016,
- "Ġnew": 1017,
- "ient": 1018,
- "Ġimpact": 1019,
- "è¿ĺ": 1020,
- "注": 1021,
- "ä¹Ī": 1022,
- "缮": 1023,
- "âĢľ": 1024,
- "âĢĿ": 1025,
- "ef": 1026,
- "ä¾ĭ": 1027,
- "Ġpotential": 1028,
- "ok": 1029,
- "åı¯èĥ½": 1030,
- "Ġtrans": 1031,
- "Ġact": 1032,
- "ï¼ī": 1033,
- "Ġspec": 1034,
- "æ¶": 1035,
- "Ġwill": 1036,
- "交": 1037,
- "ize": 1038,
- "ç¾İ": 1039,
- "å¸Ĥ": 1040,
- "Ġstud": 1041,
- "pon": 1042,
- "èº": 1043,
- "ä¸įåIJĮ": 1044,
- "one": 1045,
- "å¾Ī": 1046,
- "åıĬ": 1047,
- "å¦Ĥæŀľ": 1048,
- "çIJĥ": 1049,
- "ange": 1050,
- "Ġneed": 1051,
- "å¤ĸ": 1052,
- "ety": 1053,
- "aking": 1054,
- "请": 1055,
- "ater": 1056,
- "Ġperson": 1057,
- "ident": 1058,
- "Ġso": 1059,
- "Ġmake": 1060,
- "å¹³": 1061,
- "å¤Ł": 1062,
- "身": 1063,
- "ï¼Ī": 1064,
- "Ġinform": 1065,
- "æ¡": 1066,
- "äºĭ": 1067,
- "åıĹ": 1068,
- "ased": 1069,
- "ild": 1070,
- "Ġoff": 1071,
- "Ġthere": 1072,
- "cis": 1073,
- "è¢": 1074,
- "éĥ¨": 1075,
- "æ¯ı": 1076,
- "ract": 1077,
- "ass": 1078,
- "Ġlearning": 1079,
- "åĸ": 1080,
- "å½¢": 1081,
- "ire": 1082,
- "ä»İ": 1083,
- "bots": 1084,
- "èĻ": 1085,
- "帮": 1086,
- "Ġdes": 1087,
- "ĠIn": 1088,
- "cess": 1089,
- "Ġpe": 1090,
- "ify": 1091,
- "Ġwho": 1092,
- "ä¹ł": 1093,
- "æľŁ": 1094,
- "Ġexperi": 1095,
- "éĤ": 1096,
- "Ġsc": 1097,
- "ep": 1098,
- "ä½ķ": 1099,
- "Ġtime": 1100,
- "éĿŀ常": 1101,
- "æĭ¬": 1102,
- "åķ": 1103,
- "以ä¸ĭ": 1104,
- "éģĵ": 1105,
- "Ġcommun": 1106,
- "Ġcould": 1107,
- "ap": 1108,
- "èIJ": 1109,
- "è°ĥ": 1110,
- "lic": 1111,
- "duct": 1112,
- "Ġits": 1113,
- "cy": 1114,
- "说": 1115,
- "Ġmed": 1116,
- "Ġcol": 1117,
- "ular": 1118,
- "éĩįè¦ģ": 1119,
- "Ġsp": 1120,
- "åĪ©": 1121,
- "èµ·": 1122,
- "Ġprovid": 1123,
- "ices": 1124,
- "åĻ": 1125,
- "æĸĻ": 1126,
- "Ġimport": 1127,
- "ural": 1128,
- "åŃĹ": 1129,
- "Ġund": 1130,
- "int": 1131,
- "Ġover": 1132,
- "åı¸": 1133,
- "æł¹": 1134,
- "é¥": 1135,
- "ples": 1136,
- "ä»ĸ们": 1137,
- "gra": 1138,
- "uring": 1139,
- "now": 1140,
- "åįķ": 1141,
- "è¿ĻäºĽ": 1142,
- "åīį": 1143,
- "å®ī": 1144,
- "Ġpr": 1145,
- "åĮħæĭ¬": 1146,
- "ç»Ļ": 1147,
- "The": 1148,
- "ä½į": 1149,
- "å§": 1150,
- "ç´ł": 1151,
- "åijĺ": 1152,
- "Ġident": 1153,
- "åŀĭ": 1154,
- "Ġadd": 1155,
- "强": 1156,
- "æĺ¯ä¸Ģ": 1157,
- "ip": 1158,
- "gor": 1159,
- "Ġsupport": 1160,
- "ne": 1161,
- "Ġdiffere": 1162,
- "åħĥ": 1163,
- "Ġass": 1164,
- "åĨ³": 1165,
- "éĽ": 1166,
- "åIJį": 1167,
- "Ġgo": 1168,
- "Ġtechnology": 1169,
- "æĢ»": 1170,
- "è®®": 1171,
- "Ġinter": 1172,
- "Ġinv": 1173,
- "Ġour": 1174,
- "æķĪ": 1175,
- "ustom": 1176,
- "Ġrel": 1177,
- "ife": 1178,
- "åύ": 1179,
- "ings": 1180,
- "ä»·": 1181,
- "Ġpart": 1182,
- "被": 1183,
- "æīĭ": 1184,
- "ary": 1185,
- "Ġrespon": 1186,
- "ĊĠĠĠ": 1187,
- "好çļĦ": 1188,
- "ative": 1189,
- "帮åĬ©": 1190,
- "绣": 1191,
- "æĶ¾": 1192,
- "ĠHere": 1193,
- "çģ": 1194,
- "Ġbut": 1195,
- "æģ¯": 1196,
- "æŃ£": 1197,
- "ark": 1198,
- "åħ¬åı¸": 1199,
- "ory": 1200,
- "å¢ĥ": 1201,
- "lect": 1202,
- "éŁ": 1203,
- "æĥ³": 1204,
- "é£İ": 1205,
- "ating": 1206,
- "Ġam": 1207,
- "its": 1208,
- "æ»": 1209,
- "gorith": 1210,
- "åĵį": 1211,
- "ures": 1212,
- "Ġeffect": 1213,
- "Ġshould": 1214,
- "Ġper": 1215,
- "è±": 1216,
- "ç²": 1217,
- "ict": 1218,
- "Ġalgorith": 1219,
- "uc": 1220,
- "rough": 1221,
- "ä»»": 1222,
- "ä»¶": 1223,
- "Ġbet": 1224,
- "ia": 1225,
- "Ġanalyz": 1226,
- "æł¹æį®": 1227,
- "ized": 1228,
- "æµģ": 1229,
- "è§Ĥ": 1230,
- "è£": 1231,
- "æłĩ": 1232,
- "iron": 1233,
- "Ġcustom": 1234,
- "Ġreg": 1235,
- "Ġpersonal": 1236,
- "èĥ½å¤Ł": 1237,
- "ics": 1238,
- "ivid": 1239,
- "çĪ": 1240,
- "èµĦ": 1241,
- "æŃ¥": 1242,
- "容": 1243,
- "åĪĽ": 1244,
- "èĪ": 1245,
- "ä¹IJ": 1246,
- "导": 1247,
- "gan": 1248,
- "èĬĤ": 1249,
- "Ġall": 1250,
- "ens": 1251,
- "ame": 1252,
- "ness": 1253,
- "Ġup": 1254,
- "ĠU": 1255,
- "èĢĥ": 1256,
- "elf": 1257,
- "å̼": 1258,
- "å°ij": 1259,
- "æľį": 1260,
- "ari": 1261,
- "thical": 1262,
- "viron": 1263,
- "èĥ": 1264,
- "ord": 1265,
- "Ġsign": 1266,
- "éĩĮ": 1267,
- "ound": 1268,
- "ople": 1269,
- "åŁº": 1270,
- "Ġinformation": 1271,
- "Ġidentify": 1272,
- "åĽŀ": 1273,
- "Ġcre": 1274,
- "éŁ³": 1275,
- "ible": 1276,
- "ub": 1277,
- "è¿IJ": 1278,
- "Ġlead": 1279,
- "游": 1280,
- "次": 1281,
- "åĨĻ": 1282,
- "éĤ£": 1283,
- "get": 1284,
- "èį": 1285,
- "Ġexample": 1286,
- "ä¼ĺ": 1287,
- "å½±åĵį": 1288,
- "ish": 1289,
- "xt": 1290,
- "æº": 1291,
- "éªĮ": 1292,
- "ob": 1293,
- "客": 1294,
- "å¤ĩ": 1295,
- "åģ¥": 1296,
- "车": 1297,
- "社": 1298,
- "ividual": 1299,
- "ered": 1300,
- "les": 1301,
- "Ġenviron": 1302,
- "Ġpeople": 1303,
- "æĺŁ": 1304,
- "çĸ": 1305,
- "çĭ": 1306,
- "Ġdet": 1307,
- "æĹł": 1308,
- "Ġif": 1309,
- "ose": 1310,
- "ite": 1311,
- "å¢ŀ": 1312,
- "éĴ": 1313,
- "åIJĮæĹ¶": 1314,
- "è¿°": 1315,
- "æĸ¹å¼ı": 1316,
- "åĽ½": 1317,
- "é»": 1318,
- "å¤Ħ": 1319,
- "Ġexamples": 1320,
- "æ®": 1321,
- "Ġinto": 1322,
- "æĮĩ": 1323,
- "Ġhuman": 1324,
- "åIJij": 1325,
- "示": 1326,
- "æķ°æį®": 1327,
- "Ġ3": 1328,
- "ĠJ": 1329,
- "èı": 1330,
- "çݯå¢ĥ": 1331,
- "als": 1332,
- "erst": 1333,
- "Ġethical": 1334,
- "ç»Ħ": 1335,
- "ä¼ł": 1336,
- "Ġdifferent": 1337,
- "Ġknow": 1338,
- "åºı": 1339,
- "Ġindividual": 1340,
- "æıIJé«ĺ": 1341,
- "round": 1342,
- "å°±": 1343,
- "åıĸ": 1344,
- "åŃĺ": 1345,
- "两": 1346,
- "çŁ¥": 1347,
- "ources": 1348,
- "ck": 1349,
- "å£": 1350,
- "ines": 1351,
- "è¾¾": 1352,
- "Ġmany": 1353,
- "æķ´": 1354,
- "æł·": 1355,
- "ditional": 1356,
- "omm": 1357,
- "çͱ": 1358,
- "éĢł": 1359,
- "å®ĥ们": 1360,
- "ues": 1361,
- "Ġment": 1362,
- "Ġimportant": 1363,
- "Ġopt": 1364,
- "Ġloc": 1365,
- "ph": 1366,
- "Ġprocess": 1367,
- "Ġalgorithms": 1368,
- "设计": 1369,
- "Ġsocial": 1370,
- "very": 1371,
- "åĪĻ": 1372,
- "ä¾ĭå¦Ĥ": 1373,
- "认": 1374,
- "Ġaut": 1375,
- "Ġserv": 1376,
- "gg": 1377,
- "产åĵģ": 1378,
- "è§Ħ": 1379,
- "çľĭ": 1380,
- "vel": 1381,
- "æĸ¹æ³ķ": 1382,
- "Ġben": 1383,
- "åĽłæŃ¤": 1384,
- "care": 1385,
- "per": 1386,
- "åĬŁ": 1387,
- "建议": 1388,
- "Ġpos": 1389,
- "æ¤": 1390,
- "we": 1391,
- "åĮº": 1392,
- "iqu": 1393,
- "Ġreal": 1394,
- "æĹ¥": 1395,
- "Ġreduce": 1396,
- "af": 1397,
- "angu": 1398,
- "Ġsk": 1399,
- "Ġed": 1400,
- "erstand": 1401,
- "åĨµ": 1402,
- "mot": 1403,
- "åħĪ": 1404,
- "ç¥": 1405,
- "åºĶ该": 1406,
- "Ġthrough": 1407,
- "Ġconc": 1408,
- "åıijå±ķ": 1409,
- "è¯ķ": 1410,
- "æ¡Ī": 1411,
- "Ġenvironment": 1412,
- "åı£": 1413,
- "Ġadv": 1414,
- "åĪ«": 1415,
- "Ġbenef": 1416,
- "æ¸ħ": 1417,
- "åij³": 1418,
- "åħī": 1419,
- "Ġdevelopment": 1420,
- "eng": 1421,
- "å¦Ĥä½ķ": 1422,
- "管": 1423,
- "ivers": 1424,
- "åIJĦ": 1425,
- "Ġris": 1426,
- "row": 1427,
- "ergy": 1428,
- "计ç®Ĺ": 1429,
- "ä¿¡æģ¯": 1430,
- "Ġproduct": 1431,
- "è¾ĥ": 1432,
- "论": 1433,
- "èĩªå·±çļĦ": 1434,
- "æĬ¤": 1435,
- "åıį": 1436,
- "åħ¶ä»ĸ": 1437,
- "åĪĹ": 1438,
- "ç»Ĩ": 1439,
- "空": 1440,
- "Ġgreat": 1441,
- "ear": 1442,
- "æºIJ": 1443,
- "ject": 1444,
- "çĶŁæ´»": 1445,
- "ä¸ŃçļĦ": 1446,
- "Ġunderstand": 1447,
- "èĭ": 1448,
- "hat": 1449,
- "Ġprogra": 1450,
- "çĬ": 1451,
- "éĩij": 1452,
- "Ġincluding": 1453,
- "Ġaccess": 1454,
- "ĠĠĠĠĠĠĠ": 1455,
- "è¯Ĩ": 1456,
- "ç¦": 1457,
- "og": 1458,
- "è£ħ": 1459,
- "Ġart": 1460,
- "Ġwrit": 1461,
- "Ġincre": 1462,
- "Ġph": 1463,
- "æĸ¹éĿ¢": 1464,
- "Ġpract": 1465,
- "Ġusing": 1466,
- "项": 1467,
- "æİ¥": 1468,
- "Ġways": 1469,
- "Ġlangu": 1470,
- "æĶ¯": 1471,
- "Ġchall": 1472,
- "åİ»": 1473,
- "____": 1474,
- "imate": 1475,
- "æĸŃ": 1476,
- "è¨": 1477,
- "Ġwell": 1478,
- "ll": 1479,
- "Ġpol": 1480,
- "æĢģ": 1481,
- "Ġra": 1482,
- "Can": 1483,
- "åİŁ": 1484,
- "ber": 1485,
- "è¨Ģ": 1486,
- "ç«ĭ": 1487,
- "Ġgen": 1488,
- "éħį": 1489,
- "æ·±": 1490,
- "te": 1491,
- "ä¸ī": 1492,
- "ç§ij": 1493,
- "ĠFor": 1494,
- "线": 1495,
- "çħ": 1496,
- "æ¼": 1497,
- "åķĨ": 1498,
- "æĿIJ": 1499,
- "Ġsignific": 1500,
- "Ġgu": 1501,
- "Ġdecis": 1502,
- "Ġtrain": 1503,
- "Ġag": 1504,
- "Ġcreat": 1505,
- "å®Į": 1506,
- "æĹ¶éĹ´": 1507,
- "Ġone": 1508,
- "èĦ": 1509,
- "Ġnat": 1510,
- "åŃ¦ä¹ł": 1511,
- "çļĦæķ": 1512,
- "ced": 1513,
- "Ġwhen": 1514,
- "Ġbi": 1515,
- "èİ": 1516,
- "æĽ´åĬł": 1517,
- "ives": 1518,
- "port": 1519,
- "å·¥ä½ľ": 1520,
- "ving": 1521,
- "Ġbeen": 1522,
- "æĻº": 1523,
- "Ġlife": 1524,
- "å¼ķ": 1525,
- "arm": 1526,
- "çİĩ": 1527,
- "ç͍æĪ·": 1528,
- "ä¹ī": 1529,
- "份": 1530,
- "è¯Ŀ": 1531,
- "iness": 1532,
- "com": 1533,
- "康": 1534,
- "åĩı": 1535,
- "ä»Ģ": 1536,
- "è¾ĵ": 1537,
- "Ġvari": 1538,
- "con": 1539,
- "Ġmod": 1540,
- "ä»Ģä¹Ī": 1541,
- "Ġenergy": 1542,
- "æĬĢæľ¯": 1543,
- "ertain": 1544,
- "mm": 1545,
- "verall": 1546,
- "åĪĴ": 1547,
- "Ġrobots": 1548,
- "Ġorgan": 1549,
- "æİ¨": 1550,
- "ants": 1551,
- "åĩĨ": 1552,
- "ds": 1553,
- "æŀģ": 1554,
- "çĻ": 1555,
- "Ġrequ": 1556,
- "Ġess": 1557,
- "ç®Ģ": 1558,
- "ustain": 1559,
- "æ¨": 1560,
- "Ġstr": 1561,
- "cing": 1562,
- "ability": 1563,
- "ree": 1564,
- "Ġeduc": 1565,
- "åİĨ": 1566,
- "Ġcreate": 1567,
- "åģ¥åº·": 1568,
- "Ġdesign": 1569,
- "ips": 1570,
- "åģļ": 1571,
- "èĬ±": 1572,
- "ink": 1573,
- "èıľ": 1574,
- "æī¾": 1575,
- "段": 1576,
- "æµĭ": 1577,
- "ĠV": 1578,
- "ĠBy": 1579,
- "åĶ": 1580,
- "é¦ĸ": 1581,
- "è¯į": 1582,
- "Ġwhere": 1583,
- "Ġdisc": 1584,
- "äºĨè§£": 1585,
- "ric": 1586,
- "ä¸Ķ": 1587,
- "è¶³": 1588,
- "æĺ¯ä¸Ģ个": 1589,
- "arch": 1590,
- "积": 1591,
- "带": 1592,
- "Ġwhile": 1593,
- "Ġsignificant": 1594,
- "çłģ": 1595,
- "æĪ¿": 1596,
- "Ġbeing": 1597,
- "Ġlanguage": 1598,
- "itive": 1599,
- "20": 1600,
- "Ġanalyze": 1601,
- "æĻ¯": 1602,
- "èĮ": 1603,
- "rib": 1604,
- "模": 1605,
- "ĠSt": 1606,
- "è´¹": 1607,
- "'t": 1608,
- "Ġhealthcare": 1609,
- "Ġexperience": 1610,
- "Ġ5": 1611,
- "个人": 1612,
- "ays": 1613,
- "象": 1614,
- "plo": 1615,
- "Ġwould": 1616,
- "èĻij": 1617,
- "æĶ¶": 1618,
- "é¢Ħ": 1619,
- "é¢Ĩ": 1620,
- "ä¿ĿæĮģ": 1621,
- "ences": 1622,
- "åıª": 1623,
- "èĩ´": 1624,
- "æĪı": 1625,
- "Ġmental": 1626,
- "Ġfew": 1627,
- "ates": 1628,
- "è¿ĩç¨ĭ": 1629,
- "å®īåħ¨": 1630,
- "Ġsustain": 1631,
- "Ġwere": 1632,
- "太": 1633,
- "çĮ": 1634,
- "Ġspecific": 1635,
- "Ġworld": 1636,
- "çŃĶ": 1637,
- "```": 1638,
- "Ġtake": 1639,
- "åħ»": 1640,
- "éĢŁ": 1641,
- "ever": 1642,
- "SS": 1643,
- "éĶĢ": 1644,
- "Ġbo": 1645,
- "hes": 1646,
- "Ġmus": 1647,
- "æľįåĬ¡": 1648,
- "è§Ĵ": 1649,
- "ten": 1650,
- "æŀIJ": 1651,
- "pow": 1652,
- "dict": 1653,
- "vent": 1654,
- "10": 1655,
- "çļĦæĹ": 1656,
- "ĸçķ": 1657,
- "Ġprot": 1658,
- "ç½®": 1659,
- "Ġhigh": 1660,
- "Ġbus": 1661,
- "Ġindust": 1662,
- "åIJ¦": 1663,
- "cial": 1664,
- "人们": 1665,
- "ĠAs": 1666,
- "åijĬ": 1667,
- "ade": 1668,
- "æĶ¹": 1669,
- "çĹ": 1670,
- "Ġhad": 1671,
- "Ġher": 1672,
- "Ġjust": 1673,
- "ï¼Ľ": 1674,
- "è´Ń": 1675,
- "第": 1676,
- "éĵ": 1677,
- "Ġwater": 1678,
- "Ġfood": 1679,
- "éĺŁ": 1680,
- "aus": 1681,
- "Ġchalleng": 1682,
- "åħį": 1683,
- "æĸĩåĮĸ": 1684,
- "Ġmost": 1685,
- "é¸": 1686,
- "ç½ij": 1687,
- "缴": 1688,
- "Ġsm": 1689,
- "Ġactiv": 1690,
- "ploy": 1691,
- "Overall": 1692,
- "å¿«": 1693,
- "ruct": 1694,
- "Ġindividuals": 1695,
- "å§ĭ": 1696,
- "gies": 1697,
- "æŁ¥": 1698,
- "çα": 1699,
- "iety": 1700,
- "In": 1701,
- "åĪĨæŀIJ": 1702,
- "è§Ĩ": 1703,
- "温": 1704,
- "ç»´": 1705,
- "olut": 1706,
- "åŁŁ": 1707,
- "ommend": 1708,
- "Ġcomple": 1709,
- "æķĻ": 1710,
- "Ġbu": 1711,
- "Ġeducation": 1712,
- "ather": 1713,
- "Ġ4": 1714,
- "ting": 1715,
- "Ġfind": 1716,
- "没": 1717,
- "Ġhis": 1718,
- "ä¹ĭéĹ´": 1719,
- "Ġeffective": 1720,
- "Ġatt": 1721,
- "Ġrese": 1722,
- "èĥ½åĬĽ": 1723,
- "åŁİ": 1724,
- "Ġallow": 1725,
- "Ġav": 1726,
- "Ġpromot": 1727,
- "æĻºèĥ½": 1728,
- "满": 1729,
- "åħ±": 1730,
- "iew": 1731,
- "come": 1732,
- "ç³»ç»Ł": 1733,
- "Ġrespons": 1734,
- "äºĴ": 1735,
- "Ġcult": 1736,
- "powered": 1737,
- "Ġrecommend": 1738,
- "èIJ¥": 1739,
- "OSS": 1740,
- "Ġchange": 1741,
- "è¯ģ": 1742,
- "ved": 1743,
- "æİĴ": 1744,
- "è§£åĨ³": 1745,
- "ici": 1746,
- "ĠHow": 1747,
- "Ġfeel": 1748,
- "æľĪ": 1749,
- "Ġwhat": 1750,
- "以åıĬ": 1751,
- "Ġsee": 1752,
- "åŃ©": 1753,
- "bs": 1754,
- "Ġsur": 1755,
- "æ£": 1756,
- "ality": 1757,
- "Ġvis": 1758,
- "ç¡®ä¿Ŀ": 1759,
- "pect": 1760,
- "å®ŀçݰ": 1761,
- "Ġcare": 1762,
- "广": 1763,
- "ills": 1764,
- "åºŃ": 1765,
- "ases": 1766,
- "å¤į": 1767,
- "åºĶç͍": 1768,
- "çļĦæĥ": 1769,
- "ards": 1770,
- "Ġaddress": 1771,
- "Ġcompan": 1772,
- "Ġinvol": 1773,
- "Ġcustomer": 1774,
- "åĽłä¸º": 1775,
- "Ġstudents": 1776,
- "Ġins": 1777,
- "注æĦı": 1778,
- "æŀĦ": 1779,
- "欢": 1780,
- "æµ·": 1781,
- "åıĤ": 1782,
- "èĩªçĦ¶": 1783,
- "é©": 1784,
- "ĠThese": 1785,
- "wn": 1786,
- "æĺĵ": 1787,
- "çĬ¶": 1788,
- "ren": 1789,
- "Ġtreat": 1790,
- "Ġbenefits": 1791,
- "ĊĠĠĠĠĠĠĠ": 1792,
- "对äºİ": 1793,
- "æĢĿ": 1794,
- "ider": 1795,
- "ĠYes": 1796,
- "ĠK": 1797,
- "åĸľ": 1798,
- "Ġke": 1799,
- "Ġeng": 1800,
- "Ġpop": 1801,
- "ost": 1802,
- "pare": 1803,
- "Ġmon": 1804,
- "款": 1805,
- "ĠMOSS": 1806,
- "Ġemot": 1807,
- "Ġac": 1808,
- "ç¼ĸ": 1809,
- "fore": 1810,
- "åı¥": 1811,
- "Ġval": 1812,
- "ily": 1813,
- "Ġiss": 1814,
- "èĤī": 1815,
- "èĩ³": 1816,
- "游æĪı": 1817,
- "ween": 1818,
- "Ġinclude": 1819,
- "Ġprotect": 1820,
- "åħ³ç³»": 1821,
- "éĻ©": 1822,
- "Ġsever": 1823,
- "Ġthan": 1824,
- "éľĢæ±Ĥ": 1825,
- "ç»ĥ": 1826,
- "ĠThey": 1827,
- "iss": 1828,
- "ys": 1829,
- "Ġjob": 1830,
- "éĺ³": 1831,
- "æIJ": 1832,
- "Ġbetween": 1833,
- "Ġmach": 1834,
- "--------": 1835,
- "èĢĥèĻij": 1836,
- "è´¨éĩı": 1837,
- "Ġbusiness": 1838,
- "wor": 1839,
- "ick": 1840,
- "eg": 1841,
- "åħħ": 1842,
- "ç¯": 1843,
- "æĿ¡": 1844,
- "ner": 1845,
- "apt": 1846,
- "Ġappro": 1847,
- "Ġplay": 1848,
- "没æľī": 1849,
- "¤IJ": 1850,
- "æľª": 1851,
- "æĪĺ": 1852,
- "å®¶åºŃ": 1853,
- "ãĢĭ": 1854,
- "ency": 1855,
- "ĠCh": 1856,
- "ãĢĬ": 1857,
- "Ġproviding": 1858,
- "Ġresources": 1859,
- "âĢĻ": 1860,
- "Ġassist": 1861,
- "Ġnatural": 1862,
- "è¯Ħ": 1863,
- "便": 1864,
- "Ġsaf": 1865,
- "åħ·æľī": 1866,
- "è°¢": 1867,
- "çĥŃ": 1868,
- "ss": 1869,
- "eth": 1870,
- "old": 1871,
- "Ġperform": 1872,
- "Ġseveral": 1873,
- "é¤IJ": 1874,
- "Ġeach": 1875,
- "转": 1876,
- "ci": 1877,
- "Ġty": 1878,
- "Ġpub": 1879,
- "æ´»åĬ¨": 1880,
- "ocus": 1881,
- "çīĮ": 1882,
- "è¶Ĭ": 1883,
- "åĽ¢": 1884,
- "è½»": 1885,
- "è¯Ńè¨Ģ": 1886,
- "Ġareas": 1887,
- "éĩĩ": 1888,
- "ft": 1889,
- "riend": 1890,
- "å·²": 1891,
- "å¸Ĥåľº": 1892,
- "ition": 1893,
- "ients": 1894,
- "管çIJĨ": 1895,
- "许": 1896,
- "人类": 1897,
- "身ä½ĵ": 1898,
- "ique": 1899,
- "Ġpartic": 1900,
- "ç»Ń": 1901,
- "agement": 1902,
- "ves": 1903,
- "符": 1904,
- "line": 1905,
- "红": 1906,
- "åIJ¸": 1907,
- "Ġpatter": 1908,
- "000": 1909,
- "社ä¼ļ": 1910,
- "åĨħ容": 1911,
- "Ġorganiz": 1912,
- "ough": 1913,
- "Ġve": 1914,
- "åŃ©åŃIJ": 1915,
- "æĸ½": 1916,
- "æ¤į": 1917,
- "åĩł": 1918,
- "ä½Ĩæĺ¯": 1919,
- "Ġaff": 1920,
- "Ġnum": 1921,
- "lement": 1922,
- "èīº": 1923,
- "èij": 1924,
- "Ġcar": 1925,
- "ages": 1926,
- "abor": 1927,
- "æĺ¯ä¸Ģç§į": 1928,
- "Ġinst": 1929,
- "èĽ": 1930,
- "ä¹ĭä¸Ģ": 1931,
- "è·¯": 1932,
- "åį³": 1933,
- "Ġmain": 1934,
- "éļı": 1935,
- "How": 1936,
- "å¿ħ": 1937,
- "ç¨ĭåºı": 1938,
- "éŁ³ä¹IJ": 1939,
- "red": 1940,
- "æ²¹": 1941,
- "Ġoffer": 1942,
- "ets": 1943,
- "ç¢": 1944,
- "Ġduring": 1945,
- "çļĦ人": 1946,
- "æĽ´å¤ļ": 1947,
- "Ġdi": 1948,
- "代çłģ": 1949,
- "èİ·": 1950,
- "åħĭ": 1951,
- "Ġguid": 1952,
- "主è¦ģ": 1953,
- "Ġfam": 1954,
- "æİ§": 1955,
- "éĢļ常": 1956,
- "ĠAd": 1957,
- "å¤ĦçIJĨ": 1958,
- "urn": 1959,
- "ower": 1960,
- "åij½": 1961,
- "æıı": 1962,
- "Ġskills": 1963,
- "Ġtool": 1964,
- "ware": 1965,
- "æĸĩæľ¬": 1966,
- "Ġpatterns": 1967,
- "缮æłĩ": 1968,
- "acy": 1969,
- "æīĵ": 1970,
- "åŁİå¸Ĥ": 1971,
- "Ġevery": 1972,
- "ries": 1973,
- "读": 1974,
- "éģ¿": 1975,
- "çϽ": 1976,
- "éĢĤåIJĪ": 1977,
- "Ġpatient": 1978,
- "羣": 1979,
- "oth": 1980,
- "她": 1981,
- "åĶ®": 1982,
- "ä¸Ģç§į": 1983,
- "Ġmade": 1984,
- "ä½İ": 1985,
- "ise": 1986,
- "Ġrem": 1987,
- "æ¶Ī": 1988,
- "åIJ«": 1989,
- "air": 1990,
- "Ġgener": 1991,
- "oy": 1992,
- "ç²¾": 1993,
- "æĥħåĨµ": 1994,
- "ights": 1995,
- "Ġexpl": 1996,
- "è§ģ": 1997,
- "Ġpredict": 1998,
- "ç±³": 1999,
- "æĽ´å¥½": 2000,
- "ä¿®": 2001,
- "Ġclimate": 2002,
- "Ġfocus": 2003,
- "Ġgrow": 2004,
- "客æĪ·": 2005,
- "ä¸įæĸŃ": 2006,
- "itor": 2007,
- "ĠEn": 2008,
- "约": 2009,
- "æĺ¯åIJ¦": 2010,
- "ä»ħ": 2011,
- "æĪij们çļĦ": 2012,
- "æľĽ": 2013,
- "op": 2014,
- "Ġmaking": 2015,
- "yth": 2016,
- "ccess": 2017,
- "Ġown": 2018,
- "ggest": 2019,
- "Ġtas": 2020,
- "uture": 2021,
- "Ġmodel": 2022,
- "put": 2023,
- "Ġresearch": 2024,
- "erest": 2025,
- "éļ¾": 2026,
- "Ġ[": 2027,
- "iel": 2028,
- "ational": 2029,
- "Ġcommunic": 2030,
- "ç¥ŀ": 2031,
- "ç©¶": 2032,
- "Ġrest": 2033,
- "æĪIJ为": 2034,
- "king": 2035,
- "pr": 2036,
- "åĮ»": 2037,
- "cur": 2038,
- "èĤ²": 2039,
- "Ġ'": 2040,
- "è¿Ļç§į": 2041,
- "ç¯ĩ": 2042,
- "Ġche": 2043,
- "own": 2044,
- "éĻħ": 2045,
- "Ġfin": 2046,
- "åĪ¶ä½ľ": 2047,
- "Ġsuggest": 2048,
- "å¢ŀåĬł": 2049,
- "Ġmedia": 2050,
- "ribut": 2051,
- "çļĦæĥħ": 2052,
- "åĬłåħ¥": 2053,
- "Ġcle": 2054,
- "åij¨": 2055,
- "竳": 2056,
- "Ġthink": 2057,
- "Ġlocal": 2058,
- "pportun": 2059,
- "ĠYou": 2060,
- "Ġplan": 2061,
- "Ġeven": 2062,
- "éĽĨ": 2063,
- "å·§": 2064,
- "ax": 2065,
- "Ġchallenges": 2066,
- "Ġprof": 2067,
- "ĠCan": 2068,
- "Ġconcer": 2069,
- "Ġfuture": 2070,
- "åĬ¿": 2071,
- "Ġref": 2072,
- "èģĶ": 2073,
- "Ġself": 2074,
- "æĪĸèĢħ": 2075,
- "ble": 2076,
- "åĽ´": 2077,
- "è¿IJåĬ¨": 2078,
- "Ġinf": 2079,
- "éĩĬ": 2080,
- "Ġsustainable": 2081,
- "Ġtext": 2082,
- "Ġgra": 2083,
- "äºĮ": 2084,
- "åĵģçīĮ": 2085,
- "ä¸įåIJĮçļĦ": 2086,
- "led": 2087,
- "çĭ¬": 2088,
- "Ġopportun": 2089,
- "Ġcontin": 2090,
- "ym": 2091,
- "Ġget": 2092,
- "å¯Ĩ": 2093,
- "éϤ": 2094,
- "æħ": 2095,
- "éģ¿åħį": 2096,
- "Ġ+": 2097,
- "è§ī": 2098,
- "Ġret": 2099,
- "å¸ĥ": 2100,
- "Ġinterest": 2101,
- "Ġsociety": 2102,
- "ç»ĵæŀľ": 2103,
- "åIJ¬": 2104,
- "é¦ĸåħĪ": 2105,
- "Ġbre": 2106,
- "Ġ20": 2107,
- "ĠHowever": 2108,
- "è®°": 2109,
- "ons": 2110,
- "è¿ij": 2111,
- "å¼Ģå§ĭ": 2112,
- "Ġbuild": 2113,
- "Ġbeh": 2114,
- "'m": 2115,
- "vers": 2116,
- "Ġgood": 2117,
- "çIJĨè§£": 2118,
- "resent": 2119,
- "离": 2120,
- "åĬŁèĥ½": 2121,
- "Ġeffort": 2122,
- "labor": 2123,
- "é»ij": 2124,
- "Ġbetter": 2125,
- "Ġread": 2126,
- "å¾ĭ": 2127,
- "èĽĭ": 2128,
- "hed": 2129,
- "ä¹°": 2130,
- "导èĩ´": 2131,
- "Ġimplement": 2132,
- "ç¿": 2133,
- "享": 2134,
- "头": 2135,
- "ense": 2136,
- "Ġlong": 2137,
- "other": 2138,
- "饮": 2139,
- "åŃĺåľ¨": 2140,
- "çļĦæĦ": 2141,
- "ä¸Ģ份": 2142,
- "ython": 2143,
- "ning": 2144,
- "åĩıå°ij": 2145,
- "åĢĻ": 2146,
- "ä¸ĵ": 2147,
- "åIJĦç§į": 2148,
- "èħ": 2149,
- "å°½": 2150,
- "åįĩ": 2151,
- "æĬ¥": 2152,
- "Ġpublic": 2153,
- "Ġlar": 2154,
- "ä½łçļĦ": 2155,
- "aut": 2156,
- "é¢ĨåŁŁ": 2157,
- "æļ": 2158,
- "ollow": 2159,
- "èģĮ": 2160,
- "Ġchang": 2161,
- "Ġbest": 2162,
- "hip": 2163,
- "åĨį": 2164,
- "akes": 2165,
- "Ġchat": 2166,
- "ited": 2167,
- "Ġpower": 2168,
- "ä¿ĿæĬ¤": 2169,
- "书": 2170,
- "计åĪĴ": 2171,
- "éĩįè¦ģçļĦ": 2172,
- "åıĺåĮĸ": 2173,
- "ilities": 2174,
- "Ġconsider": 2175,
- "æĪij们åı¯ä»¥": 2176,
- "éĤ£ä¹Ī": 2177,
- "Ġide": 2178,
- "æ¼Ķ": 2179,
- "aging": 2180,
- "Ġbased": 2181,
- "å®Ŀ": 2182,
- "Ġrange": 2183,
- "Ġresult": 2184,
- "Ġmem": 2185,
- "çħ§": 2186,
- "Ġlevel": 2187,
- "cou": 2188,
- "Ġbr": 2189,
- "Th": 2190,
- "ä¼ģ": 2191,
- "建ç«ĭ": 2192,
- "Ġunique": 2193,
- "è®Ń": 2194,
- "Ġmark": 2195,
- "许å¤ļ": 2196,
- "è¡Į为": 2197,
- "Ķç©¶": 2198,
- "çļĦæĬ": 2199,
- "Ġset": 2200,
- "骤": 2201,
- "ts": 2202,
- "Ġhist": 2203,
- "Ġaround": 2204,
- "Ġrev": 2205,
- "åħ¶ä¸Ń": 2206,
- "ï¼ģ": 2207,
- "æııè¿°": 2208,
- "æľĢåIJİ": 2209,
- "Ġsim": 2210,
- "nect": 2211,
- "åĽŀçŃĶ": 2212,
- "éĺ²": 2213,
- "èī¯": 2214,
- "åΰäºĨ": 2215,
- "ä¸ĸçķ": 2216,
- "æĸ¹æ¡Ī": 2217,
- "æĿIJæĸĻ": 2218,
- "ä¸ĸçķĮ": 2219,
- "æĽ´å¥½åľ°": 2220,
- "两个": 2221,
- "Ġemploy": 2222,
- "Ġtry": 2223,
- "æĵ": 2224,
- "Ġback": 2225,
- "åĪĩ": 2226,
- "Ġsuccess": 2227,
- "Ġdecisions": 2228,
- "Ġthose": 2229,
- "å¯Į": 2230,
- "Ġfact": 2231,
- "æİ¢": 2232,
- "è¶£": 2233,
- "Ġpractices": 2234,
- "åIJĹ": 2235,
- "æīį": 2236,
- "çİ©": 2237,
- "ption": 2238,
- "æĸĩ竳": 2239,
- "Ġfeat": 2240,
- "Ġprevent": 2241,
- "Ġwriting": 2242,
- "çļĦæĢ": 2243,
- "Ġno": 2244,
- "ä»ĭ": 2245,
- "éŨ": 2246,
- "Ġdel": 2247,
- "æĴ": 2248,
- "Ġoptim": 2249,
- "ination": 2250,
- "ĠĊ": 2251,
- "usion": 2252,
- "Ġaccount": 2253,
- "ling": 2254,
- "Ġdivers": 2255,
- ".\"": 2256,
- "ath": 2257,
- "èĭ±": 2258,
- "ä¼ģä¸ļ": 2259,
- "Ġgrou": 2260,
- "åľ°çIJĥ": 2261,
- "失": 2262,
- "Ġpersonalized": 2263,
- "ĠHe": 2264,
- "表达": 2265,
- "curity": 2266,
- "Ġfollow": 2267,
- "产çĶŁ": 2268,
- "Ġear": 2269,
- "åİĭ": 2270,
- "vern": 2271,
- "Ġissues": 2272,
- "åĿĩ": 2273,
- "é²": 2274,
- "Ġdr": 2275,
- "iving": 2276,
- "Ġtraining": 2277,
- "Ġrisk": 2278,
- "åĩ½": 2279,
- "åı²": 2280,
- "æij": 2281,
- "çļĦæĹ¶": 2282,
- "ogn": 2283,
- "Ġrequire": 2284,
- "Ġenvironmental": 2285,
- "back": 2286,
- "éĶ®": 2287,
- "çĸĹ": 2288,
- "Ġinteract": 2289,
- "åĽ¢éĺŁ": 2290,
- "æ¯ı个": 2291,
- "çĦ¶åIJİ": 2292,
- "Ġdist": 2293,
- "ç͍äºİ": 2294,
- "认为": 2295,
- "åĩ½æķ°": 2296,
- "Ġsent": 2297,
- "ĊĠĠĠĠĠĠĠĠ": 2298,
- "Ġreducing": 2299,
- "å¹²": 2300,
- "Ġrep": 2301,
- "Ġcaus": 2302,
- "Ġmusic": 2303,
- "çª": 2304,
- "Ġmonitor": 2305,
- "Ġform": 2306,
- "é¢ľ": 2307,
- "çĹħ": 2308,
- "é¦Ļ": 2309,
- "Ġoften": 2310,
- "åı¯èĥ½ä¼ļ": 2311,
- "åijĺå·¥": 2312,
- "Ġhand": 2313,
- "æĬķ": 2314,
- "Ġneeds": 2315,
- "æŃ¤å¤ĸ": 2316,
- "åıĭ": 2317,
- "ivity": 2318,
- "Ġactivities": 2319,
- "åĸľæ¬¢": 2320,
- "Ġpur": 2321,
- "ian": 2322,
- "self": 2323,
- "åĬ¨çī©": 2324,
- "comes": 2325,
- "å©": 2326,
- "Ġpriv": 2327,
- "az": 2328,
- "Ġrelations": 2329,
- "Ġmachine": 2330,
- "çļĦæ°": 2331,
- "ä»·æł¼": 2332,
- "ä»·å̼": 2333,
- "ç´¢": 2334,
- "Ġfeed": 2335,
- "ä¸Ģä¸ĭ": 2336,
- "Ġteam": 2337,
- "Ġindustry": 2338,
- "è´¢": 2339,
- "ĠPro": 2340,
- "Ġwant": 2341,
- "ç§°": 2342,
- "Ġclass": 2343,
- "Ġlove": 2344,
- "åħ³äºİ": 2345,
- "è¾ĵåħ¥": 2346,
- "Ġtransport": 2347,
- "Ġcomplex": 2348,
- "Ġyear": 2349,
- "éĶĢåĶ®": 2350,
- "寻": 2351,
- "ience": 2352,
- "ists": 2353,
- "æĶ¯æĮģ": 2354,
- "Ġmind": 2355,
- "Ġfun": 2356,
- "Ġchar": 2357,
- "æĮī": 2358,
- "Ġconcerns": 2359,
- "conom": 2360,
- "ç®Ģåįķ": 2361,
- "以ä¸ĭæĺ¯": 2362,
- "Ġstart": 2363,
- "å¹¶ä¸Ķ": 2364,
- "avi": 2365,
- "ä¸ŃåĽ½": 2366,
- "åħĥç´ł": 2367,
- "Ġconf": 2368,
- "Ġpositive": 2369,
- "Ġcur": 2370,
- "Ġcount": 2371,
- "ery": 2372,
- "å¡": 2373,
- "室": 2374,
- "Ġcost": 2375,
- "Ġequ": 2376,
- "Ġpolic": 2377,
- "aste": 2378,
- "aw": 2379,
- "éħĴ": 2380,
- "coura": 2381,
- "iven": 2382,
- "place": 2383,
- "chie": 2384,
- "çļĦæķ°": 2385,
- "åĽłç´ł": 2386,
- "Ġfl": 2387,
- "ism": 2388,
- "Ġmedical": 2389,
- "Ġhumans": 2390,
- "Ġautom": 2391,
- "ertainly": 2392,
- "Ġ0": 2393,
- "Ġoffers": 2394,
- "Ġdetect": 2395,
- "Ġ6": 2396,
- "é£İæł¼": 2397,
- "Ġshow": 2398,
- "çģ«": 2399,
- "Ġanim": 2400,
- "é¢ľèī²": 2401,
- "lease": 2402,
- "ave": 2403,
- "åĵª": 2404,
- "ĠThere": 2405,
- "以ä¸Ĭ": 2406,
- "æľªæĿ¥": 2407,
- "XX": 2408,
- "çīĩ": 2409,
- "uch": 2410,
- "Ġtasks": 2411,
- "åħ·ä½ĵ": 2412,
- "æ¤įçī©": 2413,
- "Ġmin": 2414,
- "èīºæľ¯": 2415,
- "icult": 2416,
- "Ġexperiences": 2417,
- "æİ§åζ": 2418,
- "be": 2419,
- "Ġpatients": 2420,
- "å²": 2421,
- "ĠWe": 2422,
- "Ġrecogn": 2423,
- "çĥ¤": 2424,
- "Ġsmall": 2425,
- "åĿĹ": 2426,
- "åĦ": 2427,
- "太éĺ³": 2428,
- "ction": 2429,
- "Ġent": 2430,
- "æį¢": 2431,
- "Ġbefore": 2432,
- "Ġbecome": 2433,
- "å·²ç»ı": 2434,
- "表çݰ": 2435,
- "Ġexplo": 2436,
- "Ġachie": 2437,
- "ä»»åĬ¡": 2438,
- "大çļĦ": 2439,
- "Ġday": 2440,
- "Ġfound": 2441,
- "å±±": 2442,
- "ond": 2443,
- "Ġtreatment": 2444,
- "pend": 2445,
- "hen": 2446,
- "Ġcondit": 2447,
- "ç¡®å®ļ": 2448,
- "Ġbusinesses": 2449,
- "ĠWh": 2450,
- "æīĢæľī": 2451,
- "Ġdeveloped": 2452,
- "ç»Ī": 2453,
- "æŃ¥éª¤": 2454,
- "Ġdifficult": 2455,
- "åı·": 2456,
- "ĠRe": 2457,
- "éĶĻ": 2458,
- "Ġcho": 2459,
- "Ġquest": 2460,
- "Ġtranspare": 2461,
- "Ġproject": 2462,
- "Ġcommunity": 2463,
- "ov": 2464,
- "å¸Ī": 2465,
- "å¼ł": 2466,
- "åĪĨç±»": 2467,
- "人çļĦ": 2468,
- "sis": 2469,
- "çĽĬ": 2470,
- "oid": 2471,
- "ĠAn": 2472,
- "ways": 2473,
- "Ġeas": 2474,
- "Ġaffect": 2475,
- "Ġothers": 2476,
- "Ġregul": 2477,
- "æĢ§åĴĮ": 2478,
- "åĸĦ": 2479,
- "agn": 2480,
- "ä½ľä¸º": 2481,
- "åı¯ä»¥å¸®åĬ©": 2482,
- "åĦ¿": 2483,
- "Ġorganizations": 2484,
- "鸡": 2485,
- "åħ´": 2486,
- "Ġfriend": 2487,
- "Ġ$": 2488,
- "Ġdetail": 2489,
- "Ġtraditional": 2490,
- "Ġdesigned": 2491,
- "è´Ńä¹°": 2492,
- "ä½ĵéªĮ": 2493,
- "ç»į": 2494,
- "erm": 2495,
- "Ġconnect": 2496,
- "è¿Ļæł·": 2497,
- "Ġrecommendations": 2498,
- "Ġboth": 2499,
- "ŁéĢļ": 2500,
- "æ¯į": 2501,
- "Ġsit": 2502,
- "ä½ľç͍": 2503,
- "ä»ĭç»į": 2504,
- "Ġste": 2505,
- "ĠSure": 2506,
- "åı°": 2507,
- "æĤ¨çļĦ": 2508,
- "Ġshe": 2509,
- "Ġmanagement": 2510,
- "joy": 2511,
- "è´Ł": 2512,
- "Ġpromote": 2513,
- "Ġvarious": 2514,
- "(\"": 2515,
- "por": 2516,
- "Ġsens": 2517,
- "Ġessential": 2518,
- "gether": 2519,
- "ularly": 2520,
- "äºī": 2521,
- "irst": 2522,
- "Ġop": 2523,
- "Ġspecies": 2524,
- "çİ°åľ¨": 2525,
- "cho": 2526,
- "Ġbehavi": 2527,
- "çŃij": 2528,
- "女": 2529,
- "Ġquality": 2530,
- "Ġext": 2531,
- "è¥": 2532,
- "å®ĮæĪIJ": 2533,
- "æĢ»ä¹ĭ": 2534,
- "éĥ¨åĪĨ": 2535,
- "ä»İèĢĮ": 2536,
- "åĽ¾": 2537,
- "Ġtyp": 2538,
- "Ġstrate": 2539,
- "西": 2540,
- "Ġhere": 2541,
- "ars": 2542,
- "å¸Į": 2543,
- "çļĦæĿ": 2544,
- "å°Ŀ": 2545,
- "ee": 2546,
- "ier": 2547,
- "Ġec": 2548,
- "ically": 2549,
- "ering": 2550,
- "念": 2551,
- "ĠDe": 2552,
- "Ġneg": 2553,
- "建çŃij": 2554,
- "Ġservices": 2555,
- "Ġable": 2556,
- "imes": 2557,
- "Ġoptions": 2558,
- "缸åħ³": 2559,
- "Ġsub": 2560,
- "Ġdecision": 2561,
- "ĠCertainly": 2562,
- "Ġåľ¨": 2563,
- "æ¢": 2564,
- "Ġservice": 2565,
- "):": 2566,
- "带æĿ¥": 2567,
- "Ġchild": 2568,
- "è§£éĩĬ": 2569,
- "irt": 2570,
- "çĨ": 2571,
- "ä¸įä»ħ": 2572,
- "æĿ¾": 2573,
- "积æŀģ": 2574,
- "ron": 2575,
- "åı¤": 2576,
- "çłĶç©¶": 2577,
- "ç²ī": 2578,
- "hor": 2579,
- "Ġprofess": 2580,
- "çļĦéĹ®é¢ĺ": 2581,
- "Ġopportunities": 2582,
- "åİĨåı²": 2583,
- "Ġdef": 2584,
- "ĠAm": 2585,
- "Ġgr": 2586,
- "aur": 2587,
- "å±Ĥ": 2588,
- "çŃĸ": 2589,
- "Ġpopular": 2590,
- "æ´ģ": 2591,
- "åıijçݰ": 2592,
- "Ġpoem": 2593,
- "èµĽ": 2594,
- "Ġob": 2595,
- "Ġdon": 2596,
- "Ġsound": 2597,
- "Ġtransportation": 2598,
- "ious": 2599,
- "åı¦": 2600,
- "Ġrole": 2601,
- "Ġfiel": 2602,
- "ç§ijåѦ": 2603,
- "èĢģ": 2604,
- "reen": 2605,
- "æľīæķĪ": 2606,
- "Ġcor": 2607,
- "Ġfeedback": 2608,
- "Ġtechnologies": 2609,
- "交éĢļ": 2610,
- "Ġadapt": 2611,
- "'re": 2612,
- "ervation": 2613,
- "Ġcommunities": 2614,
- "çݰ代": 2615,
- "Ġlook": 2616,
- "Ġfac": 2617,
- "ç͵影": 2618,
- "Ġcollect": 2619,
- "å¾Ĺåΰ": 2620,
- "hips": 2621,
- "Ġavail": 2622,
- "eren": 2623,
- "ä¸Ģèµ·": 2624,
- "çīĽ": 2625,
- "Ġposs": 2626,
- "Ġweather": 2627,
- "Ġefforts": 2628,
- "¿Ģ": 2629,
- "æĹħ": 2630,
- "oh": 2631,
- "Ġcollabor": 2632,
- "æĭ¥": 2633,
- "æĪIJåĬŁ": 2634,
- "èİ·å¾Ĺ": 2635,
- "å±ħ": 2636,
- "Ġtre": 2637,
- "Ġsources": 2638,
- "Ġstudy": 2639,
- "Ġprograms": 2640,
- "éĻIJ": 2641,
- "Ġtips": 2642,
- "Ġmarket": 2643,
- "ally": 2644,
- "害": 2645,
- "wards": 2646,
- "æ£Ģ": 2647,
- "ä¸Ģç¯ĩ": 2648,
- "rior": 2649,
- "Ġtop": 2650,
- "Ġend": 2651,
- "åĭ": 2652,
- "Ġlarge": 2653,
- "iciency": 2654,
- "Ġdec": 2655,
- "å®ļçļĦ": 2656,
- "icient": 2657,
- "è¿ĩç¨ĭä¸Ń": 2658,
- "lications": 2659,
- "缺": 2660,
- "Ġtour": 2661,
- "Ġtogether": 2662,
- "人工": 2663,
- "Ġtools": 2664,
- "æĸ¯": 2665,
- "æ°ij": 2666,
- "æĬĬ": 2667,
- "ä¹ĭéĹ´çļĦ": 2668,
- "çī¹çĤ¹": 2669,
- "Ġbel": 2670,
- "ditionally": 2671,
- "åĪ©ç͍": 2672,
- "è¾¹": 2673,
- "éĻį": 2674,
- "ĠIf": 2675,
- "é¢Ŀ": 2676,
- "åįı": 2677,
- "å¾Ģ": 2678,
- "lish": 2679,
- "è¯ī": 2680,
- "ins": 2681,
- "奶": 2682,
- "Ġeconom": 2683,
- "Ġinvest": 2684,
- "ĠDo": 2685,
- "tain": 2686,
- "åĩºçݰ": 2687,
- "çļĦå½±åĵį": 2688,
- "aterial": 2689,
- "Ġsure": 2690,
- "Ġpass": 2691,
- "çĶ»": 2692,
- "è´£": 2693,
- "ç»ĵæŀĦ": 2694,
- "æķħ": 2695,
- "æĥħæĦŁ": 2696,
- "æ¿Ģ": 2697,
- "ellig": 2698,
- "ä¼Ĺ": 2699,
- "æ¯Ķè¾ĥ": 2700,
- "tern": 2701,
- "Ġoutcomes": 2702,
- "up": 2703,
- "Ġbeaut": 2704,
- "read": 2705,
- "çĶŁæĪIJ": 2706,
- "æķ°åŃĹ": 2707,
- "Ġdem": 2708,
- "ires": 2709,
- "åı¯ä»¥éĢļè¿ĩ": 2710,
- "æĸ°çļĦ": 2711,
- "Ġdeep": 2712,
- "å¨": 2713,
- "çĭĹ": 2714,
- "åħ³æ³¨": 2715,
- "çĶŁåij½": 2716,
- "ä¼łç»Ł": 2717,
- "Ġstay": 2718,
- "æŃĮ": 2719,
- "åħ³éĶ®": 2720,
- "Ġplace": 2721,
- "主é¢ĺ": 2722,
- "å¾Īå¤ļ": 2723,
- "èĪĴ": 2724,
- "Ġprofessional": 2725,
- "yle": 2726,
- "æĽ²": 2727,
- "19": 2728,
- "Ġessay": 2729,
- "Ġgive": 2730,
- "ç³ĸ": 2731,
- "Ġonly": 2732,
- "æŁIJ": 2733,
- "Ġphys": 2734,
- "对è¯Ŀ": 2735,
- "Ġcontro": 2736,
- "Ġamount": 2737,
- "cept": 2738,
- "ization": 2739,
- "ç¼ĸåĨĻ": 2740,
- "åıĹåΰ": 2741,
- "Ġalways": 2742,
- "æ¯Ķå¦Ĥ": 2743,
- "Ġprivacy": 2744,
- "au": 2745,
- "________": 2746,
- "Ġresponsible": 2747,
- "()": 2748,
- "çŃīçŃī": 2749,
- "Ġmaterial": 2750,
- "Ġonline": 2751,
- "é¼": 2752,
- "æĶ¿": 2753,
- "åĽĽ": 2754,
- "Ġenjoy": 2755,
- "åľŁ": 2756,
- "Ġsafety": 2757,
- "Ġtw": 2758,
- "Ġcommunication": 2759,
- "丽": 2760,
- "æĺ¾": 2761,
- "olution": 2762,
- "erg": 2763,
- "įä½ľ": 2764,
- "Ġuser": 2765,
- "Ġemotional": 2766,
- "time": 2767,
- "é¾": 2768,
- "Ġsecurity": 2769,
- "Ġsense": 2770,
- "elines": 2771,
- "åĬ±": 2772,
- "çī©è´¨": 2773,
- "ura": 2774,
- "Ġshare": 2775,
- "Ġanalyzing": 2776,
- "ital": 2777,
- "é±": 2778,
- "irtual": 2779,
- "Ġvisit": 2780,
- "bers": 2781,
- "Ġcour": 2782,
- "Ġproble": 2783,
- "设å¤ĩ": 2784,
- "atch": 2785,
- "land": 2786,
- "é±¼": 2787,
- "æĪij们éľĢè¦ģ": 2788,
- "稳": 2789,
- "ibility": 2790,
- "Ġefficiency": 2791,
- "声": 2792,
- "èĴ": 2793,
- "æľºåύ": 2794,
- "Ġclear": 2795,
- "åζå®ļ": 2796,
- "izing": 2797,
- "Ġconditions": 2798,
- "lusion": 2799,
- "Ġlow": 2800,
- "Ġlim": 2801,
- "hers": 2802,
- "Ġrisks": 2803,
- "ç¿»": 2804,
- "Ġlet": 2805,
- "åĴĸ": 2806,
- "å¿ĥçIJĨ": 2807,
- "è¿ľ": 2808,
- "print": 2809,
- "Ġchanges": 2810,
- "Ġmeas": 2811,
- "Ġimproving": 2812,
- "Ġcrit": 2813,
- "50": 2814,
- "å¸ĮæľĽ": 2815,
- "Ġaud": 2816,
- "åįĹ": 2817,
- "æĹłæ³ķ": 2818,
- "Ġnegative": 2819,
- "é¡¹çĽ®": 2820,
- "und": 2821,
- "ats": 2822,
- "Ġcompanies": 2823,
- "æī¾åΰ": 2824,
- "Ġcontribut": 2825,
- "æŃ£ç¡®": 2826,
- "é»Ħ": 2827,
- "å±ŀ": 2828,
- "Ġunderstanding": 2829,
- "Ġmult": 2830,
- "Ġclo": 2831,
- "å¾ģ": 2832,
- "Ġprior": 2833,
- "rim": 2834,
- "人工æĻºèĥ½": 2835,
- "Ġvariety": 2836,
- "Ġtaking": 2837,
- "åĤ": 2838,
- "aster": 2839,
- "ody": 2840,
- "Ġ{": 2841,
- "çļĦéĩįè¦ģ": 2842,
- "Ġfore": 2843,
- "èµĦæºIJ": 2844,
- "è¦ģæ±Ĥ": 2845,
- "Ġfeatures": 2846,
- "èįī": 2847,
- "me": 2848,
- "èĮĥ": 2849,
- "Ġoper": 2850,
- "级": 2851,
- "é²ľ": 2852,
- "æĬĢå·§": 2853,
- "ijæĪĺ": 2854,
- "ç±»åŀĭ": 2855,
- "æĿ¿": 2856,
- "软": 2857,
- "ew": 2858,
- "Ġrestaur": 2859,
- "Ġwithout": 2860,
- "ructure": 2861,
- "çļĦæĺ¯": 2862,
- "çı": 2863,
- "Ġlist": 2864,
- "urate": 2865,
- "Ġbook": 2866,
- "亲": 2867,
- "åºĹ": 2868,
- "ä¹Łæĺ¯": 2869,
- "ä»»ä½ķ": 2870,
- "Ġcam": 2871,
- "ĠBe": 2872,
- "Ġgovern": 2873,
- "Ġbehavior": 2874,
- "è®Ńç»ĥ": 2875,
- "Ġfamily": 2876,
- "æĿĤ": 2877,
- "Ġcity": 2878,
- "Ġapproach": 2879,
- "Ġaccurate": 2880,
- "Ġsom": 2881,
- "Ġel": 2882,
- "èĪŀ": 2883,
- "èŀ": 2884,
- "åŁºæľ¬": 2885,
- "Ġdise": 2886,
- "Ġencoura": 2887,
- "ĠWhat": 2888,
- "åĥ": 2889,
- "详": 2890,
- "¦Ĥ": 2891,
- "å·¥åħ·": 2892,
- "åķ¡": 2893,
- "Ġstill": 2894,
- "chool": 2895,
- "æĦŁåΰ": 2896,
- "çĶŁçī©": 2897,
- "åĴĸåķ¡": 2898,
- "åĩĨå¤ĩ": 2899,
- "Ġwaste": 2900,
- "Ġevents": 2901,
- "æķĻèĤ²": 2902,
- "Ġ8": 2903,
- "Ġmust": 2904,
- "ied": 2905,
- "asing": 2906,
- "å½¢æĪIJ": 2907,
- "Ġproducts": 2908,
- "åħ¸": 2909,
- "讲": 2910,
- "fter": 2911,
- "å·®": 2912,
- "less": 2913,
- "Ġcro": 2914,
- "Ġfinan": 2915,
- "åıįåºĶ": 2916,
- "åĪĽéĢł": 2917,
- "Ġguidelines": 2918,
- "åΤ": 2919,
- "ä½ľåĵģ": 2920,
- "表示": 2921,
- "å¼Ĥ": 2922,
- "Ġknown": 2923,
- "Ġtest": 2924,
- "误": 2925,
- "ope": 2926,
- "Ġusers": 2927,
- "AI": 2928,
- "å¾·": 2929,
- "new": 2930,
- "追": 2931,
- "iques": 2932,
- "模åŀĭ": 2933,
- "åĬĽåĴĮ": 2934,
- "Ġhistory": 2935,
- "ĠAl": 2936,
- "æĬķèµĦ": 2937,
- "å°Ŀè¯ķ": 2938,
- "ank": 2939,
- "Ġhome": 2940,
- "éĴŁ": 2941,
- "丰": 2942,
- "èĪĴéĢĤ": 2943,
- "Ġincrease": 2944,
- "Ġhab": 2945,
- "åĪ»": 2946,
- "è¾ĵåĩº": 2947,
- "Ġleading": 2948,
- "Ġ7": 2949,
- "é£İéĻ©": 2950,
- "Ġperformance": 2951,
- "Ġhapp": 2952,
- "åŃ£": 2953,
- "Ġstand": 2954,
- "ty": 2955,
- "ç¦ı": 2956,
- "Ġcustomers": 2957,
- "åįİ": 2958,
- "Ġbelie": 2959,
- "Ġcompany": 2960,
- "å½ķ": 2961,
- "é£Łçī©": 2962,
- "ĠUn": 2963,
- "Ġsumm": 2964,
- "rent": 2965,
- "ĠCon": 2966,
- "éĢĤéĩı": 2967,
- "anced": 2968,
- "Ġi": 2969,
- "Ġlight": 2970,
- "Ġanalysis": 2971,
- "å°Ĭ": 2972,
- "ĠUse": 2973,
- "ouse": 2974,
- "ted": 2975,
- "Ġcharact": 2976,
- "Ġ#": 2977,
- "to": 2978,
- "绾": 2979,
- "ä¸įæĺ¯": 2980,
- "Ġdeveloping": 2981,
- "åŁ¹": 2982,
- "Ġstrategies": 2983,
- "Ġmight": 2984,
- "çŁŃ": 2985,
- "çļĦæİ": 2986,
- "Ġfirst": 2987,
- "èĥĮ": 2988,
- "çĮ«": 2989,
- "Ġincludes": 2990,
- "åĽŃ": 2991,
- "Ġdiagn": 2992,
- "Ġgrowth": 2993,
- "ä¸ĵä¸ļ": 2994,
- "Ġdoes": 2995,
- "12": 2996,
- "绿": 2997,
- "Ġkeep": 2998,
- "详ç»Ĩ": 2999,
- "åĥı": 3000,
- "åıijçĶŁ": 3001,
- "fact": 3002,
- "åı¯ä»¥åľ¨": 3003,
- "ç«Ļ": 3004,
- "æĭī": 3005,
- "æµİ": 3006,
- "Ġchatbots": 3007,
- "Ġbreak": 3008,
- "è¡¡": 3009,
- "çŁ³": 3010,
- "æĮģç»Ń": 3011,
- "life": 3012,
- "Ġ10": 3013,
- "æ´Ĺ": 3014,
- "ĠAdditionally": 3015,
- "士": 3016,
- "ember": 3017,
- "Ġgoals": 3018,
- "å¾®": 3019,
- "Ġview": 3020,
- "·": 3021,
- "ove": 3022,
- "åŁºç¡": 3023,
- "Ġoptimize": 3024,
- "Ġtem": 3025,
- "Ġdown": 3026,
- "åŁºç¡Ģ": 3027,
- "è¶ħ": 3028,
- "ercis": 3029,
- "Ġless": 3030,
- "ees": 3031,
- "æĿĥ": 3032,
- "Ġkey": 3033,
- "Ġworks": 3034,
- "讨": 3035,
- "åı¥åŃIJ": 3036,
- "Ġrobot": 3037,
- "uss": 3038,
- "åħ¨çIJĥ": 3039,
- "ç»ıæµİ": 3040,
- "æīįèĥ½": 3041,
- "egr": 3042,
- "ä»ĸ们çļĦ": 3043,
- "äºĶ": 3044,
- "èµ·æĿ¥": 3045,
- "çĵ": 3046,
- "Ġfactors": 3047,
- "Ġcultural": 3048,
- "æľ¨": 3049,
- "Ġworking": 3050,
- "ä¼¼": 3051,
- "èIJ½": 3052,
- "éĢŁåº¦": 3053,
- "ä½ı": 3054,
- "Ġeffects": 3055,
- "å©ļ": 3056,
- "br": 3057,
- "åİħ": 3058,
- "rain": 3059,
- "\")": 3060,
- "åѦçĶŁ": 3061,
- "\",": 3062,
- "Ġpar": 3063,
- "atform": 3064,
- "Ġensuring": 3065,
- "çͱäºİ": 3066,
- "Ġmuch": 3067,
- "Ġwords": 3068,
- "Ġmar": 3069,
- "ç»ıéªĮ": 3070,
- "为äºĨ": 3071,
- "åIJĪä½ľ": 3072,
- "ven": 3073,
- "Ġ/": 3074,
- "Ġfinancial": 3075,
- "work": 3076,
- "ories": 3077,
- "æ²»": 3078,
- "Ġtechniques": 3079,
- "æĭ¥æľī": 3080,
- "rap": 3081,
- "å°Ķ": 3082,
- "Ġest": 3083,
- "Ġavailable": 3084,
- "Ġlit": 3085,
- "æ¹": 3086,
- "Ġefficient": 3087,
- "els": 3088,
- "over": 3089,
- "Ġland": 3090,
- "Ġarea": 3091,
- "Ġintellig": 3092,
- "Ġpref": 3093,
- "ature": 3094,
- "çŁ¥è¯Ĩ": 3095,
- "æĵįä½ľ": 3096,
- "å¾ħ": 3097,
- "igate": 3098,
- "çļĦæĶ": 3099,
- "Ġmean": 3100,
- "bo": 3101,
- "Ġcontrol": 3102,
- "éĩĩç͍": 3103,
- "ricult": 3104,
- "Ġprogramm": 3105,
- "Ġtowards": 3106,
- "thing": 3107,
- "ä¸įè¦ģ": 3108,
- "Ġthough": 3109,
- "彩": 3110,
- "Ġcertain": 3111,
- "Ġwild": 3112,
- "ä»Ĭ": 3113,
- "Ġconservation": 3114,
- "çŁ¥éģĵ": 3115,
- "Ġreally": 3116,
- "çļĦåľ°": 3117,
- "io": 3118,
- "饰": 3119,
- "Ġful": 3120,
- "çݯä¿Ŀ": 3121,
- "Ġexplore": 3122,
- "çļĦæ¸": 3123,
- "Ġdiverse": 3124,
- "åĬłå¼º": 3125,
- "çļ®": 3126,
- "Ġemotions": 3127,
- "Ġavoid": 3128,
- "'ll": 3129,
- "çļĦæī": 3130,
- "åį¡": 3131,
- "Ġplatform": 3132,
- "ances": 3133,
- "Ġsitu": 3134,
- "ä»ĺ": 3135,
- "ä½įç½®": 3136,
- "oring": 3137,
- "çĽIJ": 3138,
- "ä¸ĩ": 3139,
- "Ġdev": 3140,
- "nov": 3141,
- "ash": 3142,
- "Ġtwo": 3143,
- "å®ł": 3144,
- "bon": 3145,
- "èµ°": 3146,
- "åĪĹ表": 3147,
- "Ġcy": 3148,
- "èįIJ": 3149,
- "ĠSome": 3150,
- "Ġexplain": 3151,
- "Ġaware": 3152,
- "社交": 3153,
- "day": 3154,
- "åıĮ": 3155,
- "æ²ŁéĢļ": 3156,
- "æ°§": 3157,
- "å¼Ģåıij": 3158,
- "åħ¬åı¸çļĦ": 3159,
- "Ġair": 3160,
- "åĩ»": 3161,
- "aring": 3162,
- "éĥ½æĺ¯": 3163,
- "Ġlevels": 3164,
- "ods": 3165,
- "Ġsteps": 3166,
- "Ġcap": 3167,
- "æ´ŀ": 3168,
- "马": 3169,
- "Ġreturn": 3170,
- "Ġmet": 3171,
- "çĶŁæĢģ": 3172,
- "丰å¯Į": 3173,
- "æŁĵ": 3174,
- "æīĢ以": 3175,
- "é¡»": 3176,
- "Ġer": 3177,
- "Ġfra": 3178,
- "30": 3179,
- "èĵ": 3180,
- "âĢĶ": 3181,
- "Ġå½ĵ": 3182,
- "ah": 3183,
- "ä¿ĥ": 3184,
- "Ġlikely": 3185,
- "ĠĠĠĠĠĠĠĠĠĠĠĠĠĠĠĠ": 3186,
- "åĪĿ": 3187,
- "Ġcreating": 3188,
- "Ġfarm": 3189,
- "Ġbal": 3190,
- "Ġlives": 3191,
- "å®ĥçļĦ": 3192,
- "Ġability": 3193,
- "ä¸ĬçļĦ": 3194,
- "Ġsentence": 3195,
- "åĤ¨": 3196,
- "Ġrout": 3197,
- "Ġprovides": 3198,
- "Ġagain": 3199,
- "å®łçī©": 3200,
- "éĢIJ": 3201,
- "Ġyears": 3202,
- "èŀį": 3203,
- "Ġphysical": 3204,
- "Python": 3205,
- "ĠEx": 3206,
- "iting": 3207,
- "è°ĥæķ´": 3208,
- "ç½ij绾": 3209,
- "æħ¢": 3210,
- "空éĹ´": 3211,
- "åĽ°": 3212,
- "è±Ĩ": 3213,
- "æĽ´å¤ļçļĦ": 3214,
- "ĠAr": 3215,
- "Ġmaintain": 3216,
- "å®ŀéĻħ": 3217,
- "Ġtravel": 3218,
- "Ġsat": 3219,
- "pro": 3220,
- "ç͵åŃIJ": 3221,
- "æ±½": 3222,
- "ex": 3223,
- "åģĩ": 3224,
- "æIJŃ": 3225,
- "éļıçĿĢ": 3226,
- "è¿ĺæľī": 3227,
- "礼": 3228,
- "ale": 3229,
- "Ġconsum": 3230,
- "ĊĠ": 3231,
- "ncy": 3232,
- "Ġquestions": 3233,
- "fort": 3234,
- "making": 3235,
- "Ġdesc": 3236,
- "15": 3237,
- "Ġinvolves": 3238,
- "Ġstress": 3239,
- "åŃĹ符": 3240,
- "here": 3241,
- "Ġimpacts": 3242,
- "Ġexercis": 3243,
- "åĿļ": 3244,
- "ledge": 3245,
- "ç§ijæĬĢ": 3246,
- "oci": 3247,
- "Ġeffectively": 3248,
- "æ¶Īè´¹": 3249,
- "Ġconclusion": 3250,
- "éĺħ": 3251,
- "Ġstre": 3252,
- "issions": 3253,
- "æ·»": 3254,
- "It": 3255,
- "éĿĻ": 3256,
- "Ġvirtual": 3257,
- "è¡£": 3258,
- "Ġachieve": 3259,
- "ource": 3260,
- "è¿ŀ": 3261,
- "acks": 3262,
- "è¡¨æł¼": 3263,
- "Ġimportance": 3264,
- "èĩªæĪij": 3265,
- "These": 3266,
- "num": 3267,
- "çļĦæł": 3268,
- "Ġrelationships": 3269,
- "Ġworkers": 3270,
- "gical": 3271,
- "orpor": 3272,
- "erson": 3273,
- "åij¢": 3274,
- "nds": 3275,
- "æİ¨èįIJ": 3276,
- "ohn": 3277,
- "å¿ħé¡»": 3278,
- "容æĺĵ": 3279,
- "ĠGo": 3280,
- "Ġtell": 3281,
- "ĠRes": 3282,
- "onom": 3283,
- "Ġbec": 3284,
- "æ³Ľ": 3285,
- "pos": 3286,
- "Ġmove": 3287,
- "Ġstory": 3288,
- "æŃ¢": 3289,
- "Ġpriorit": 3290,
- "Ġindustries": 3291,
- "èľ": 3292,
- "Ġpossible": 3293,
- "ĠMan": 3294,
- "Ġexpress": 3295,
- "abilities": 3296,
- "Ġintegr": 3297,
- "代表": 3298,
- "Ġrespond": 3299,
- "åĪĨéĴŁ": 3300,
- "æľºä¼ļ": 3301,
- "Ġthings": 3302,
- "交æµģ": 3303,
- "Ġmeth": 3304,
- "urther": 3305,
- "Ġwide": 3306,
- "èijĹ": 3307,
- "æĪijçļĦ": 3308,
- "ĸçķ¥": 3309,
- "ides": 3310,
- "ething": 3311,
- "ĠWhile": 3312,
- "pan": 3313,
- "çŃĸçķ¥": 3314,
- "Ġcent": 3315,
- "Ġplease": 3316,
- "ology": 3317,
- "uracy": 3318,
- "循": 3319,
- "ward": 3320,
- "nce": 3321,
- "Ġthen": 3322,
- "çªģ": 3323,
- "å¥ĩ": 3324,
- "Ġblo": 3325,
- "ai": 3326,
- "æŀĹ": 3327,
- "ç®Ĺæ³ķ": 3328,
- "综": 3329,
- "Ġprint": 3330,
- "aces": 3331,
- "lu": 3332,
- "ªæĸ½": 3333,
- "pre": 3334,
- "çļĦæĦı": 3335,
- "Ġsol": 3336,
- "Ġoverall": 3337,
- "hold": 3338,
- "Ġes": 3339,
- "çļĦä¸Ģ": 3340,
- "éģĩ": 3341,
- "Ġpopul": 3342,
- "å°ı说": 3343,
- "æ³¢": 3344,
- "åįģ": 3345,
- "ä¹Łåı¯ä»¥": 3346,
- "é£Łåĵģ": 3347,
- "Ġcontent": 3348,
- "å°Ħ": 3349,
- "Ġrequires": 3350,
- "æ£ĢæŁ¥": 3351,
- "ĊĠĠĠĠĠĠĠĠĠĠĠ": 3352,
- "Ġgroups": 3353,
- "Ġfair": 3354,
- "Ġbl": 3355,
- "å®ŀéªĮ": 3356,
- "æĮīçħ§": 3357,
- "osp": 3358,
- "str": 3359,
- "ä¸įèĥ½": 3360,
- "Ġharm": 3361,
- "Ġprodu": 3362,
- "çļĦæĬĢ": 3363,
- "çĩ": 3364,
- "tle": 3365,
- "Ġanimals": 3366,
- "è§Ĵèī²": 3367,
- "lev": 3368,
- "æ¸IJ": 3369,
- "å¤įæĿĤ": 3370,
- "Ġdepend": 3371,
- "æĮijæĪĺ": 3372,
- "åĮħåIJ«": 3373,
- "Ġhelps": 3374,
- "Ġopen": 3375,
- "Ġnet": 3376,
- "ĠĠĠĠĠ": 3377,
- "Ġstrong": 3378,
- "Ġjour": 3379,
- "å¹¿æ³Ľ": 3380,
- "æķ´ä¸ª": 3381,
- "Ġelect": 3382,
- "Ġresponse": 3383,
- "åįķè¯į": 3384,
- "æľĭ": 3385,
- "Ġ<": 3386,
- "åĮĸåѦ": 3387,
- "éĴĪ": 3388,
- "Ġquick": 3389,
- "ually": 3390,
- "Ġsomething": 3391,
- "Ġtrack": 3392,
- "度åĴĮ": 3393,
- "erences": 3394,
- "æłij": 3395,
- "Ġaccuracy": 3396,
- "Ġexc": 3397,
- "é£ŀ": 3398,
- "Ġfield": 3399,
- "寻æī¾": 3400,
- "éħ¸": 3401,
- "Ġhope": 3402,
- "çij": 3403,
- "Ġinnov": 3404,
- "绪": 3405,
- "alk": 3406,
- "Ġtypes": 3407,
- "Ġdid": 3408,
- "åĬª": 3409,
- "Ġcall": 3410,
- "è¯Ĺ": 3411,
- "Ġearly": 3412,
- "ĠOne": 3413,
- "app": 3414,
- "Ġcommon": 3415,
- "æľĢç»Ī": 3416,
- "Ġcheck": 3417,
- "Ġsym": 3418,
- "çĤĴ": 3419,
- "æĬĢèĥ½": 3420,
- "Ġenh": 3421,
- "Ġagricult": 3422,
- "Ġimm": 3423,
- "ç»ĩ": 3424,
- "满足": 3425,
- "Ġschool": 3426,
- "bal": 3427,
- "Ġfollowing": 3428,
- "based": 3429,
- "Ġwebs": 3430,
- "Ġculture": 3431,
- "ĠCom": 3432,
- "way": 3433,
- "ä¸Ģå®ļ": 3434,
- "åķĨåĵģ": 3435,
- "ude": 3436,
- "çļĦåıijå±ķ": 3437,
- "çĶŁäº§": 3438,
- "osystem": 3439,
- "Ġplant": 3440,
- "åı¶": 3441,
- "åIJĥ": 3442,
- "ä»ĸçļĦ": 3443,
- "der": 3444,
- "询": 3445,
- "å®¶åħ·": 3446,
- "Ġfree": 3447,
- "ç§»": 3448,
- "æİĮ": 3449,
- "Ġbody": 3450,
- "Ġpresent": 3451,
- "Ġparticularly": 3452,
- "Ġchildren": 3453,
- "Ġstudent": 3454,
- ").": 3455,
- "çī¹å¾ģ": 3456,
- "èĶ": 3457,
- "éĺħ读": 3458,
- "æķĪçİĩ": 3459,
- "Ġprogram": 3460,
- "éħ±": 3461,
- "åıĺå¾Ĺ": 3462,
- "ix": 3463,
- "Ġcome": 3464,
- "çļĦæ²": 3465,
- "ĠTe": 3466,
- "ĠTo": 3467,
- "åħ±åIJĮ": 3468,
- "Ġemployees": 3469,
- "说æĺİ": 3470,
- "Ġheart": 3471,
- "Ġmot": 3472,
- "æľĭåıĭ": 3473,
- "eric": 3474,
- "è¯ij": 3475,
- "Ġcurrent": 3476,
- "æĪIJæľ¬": 3477,
- "Ġtoo": 3478,
- "çݩ家": 3479,
- "åĪĽæĸ°": 3480,
- "Ġecosystem": 3481,
- "常è§ģ": 3482,
- "ä¸ĢæŃ¥": 3483,
- "Ġpres": 3484,
- "Ġmulti": 3485,
- "åijĬè¯ī": 3486,
- "严": 3487,
- "Ġmit": 3488,
- "Ġaction": 3489,
- "çĨŁ": 3490,
- "Ġhabit": 3491,
- "åı£æĦŁ": 3492,
- "ç®±": 3493,
- "Ġuses": 3494,
- "å¢ŀ强": 3495,
- "ç»Ļåĩº": 3496,
- "Ġ9": 3497,
- "Ġdep": 3498,
- "Ġeconomic": 3499,
- "æĢ§çļĦ": 3500,
- "18": 3501,
- "åĨ°": 3502,
- "Ġhelped": 3503,
- "åIJ¸å¼ķ": 3504,
- "çİĭ": 3505,
- "Ġdiagnos": 3506,
- "åł": 3507,
- "èģĶç³»": 3508,
- "群": 3509,
- "ç»ĥä¹ł": 3510,
- "æĪIJéķ¿": 3511,
- "Ġpoint": 3512,
- "å®ļæľŁ": 3513,
- "åij¼": 3514,
- "èį¯": 3515,
- "æĿ¯": 3516,
- "æ¤Ĵ": 3517,
- "æķĪæŀľ": 3518,
- "Ġspecial": 3519,
- "æ··": 3520,
- "åĩłä¸ª": 3521,
- "ause": 3522,
- "éĨ": 3523,
- "æ¯ĶèµĽ": 3524,
- "è·Ŀ": 3525,
- "What": 3526,
- "Ġtimes": 3527,
- "icles": 3528,
- "Ġ*": 3529,
- "ç´§": 3530,
- "å¦Ĥæŀľä½ł": 3531,
- "çĭ¬çī¹": 3532,
- "çģµ": 3533,
- "ç¨İ": 3534,
- "Ġcarbon": 3535,
- "Ġbias": 3536,
- "åĬ©äºİ": 3537,
- "Ġconst": 3538,
- "èĩªçͱ": 3539,
- "æĿ¥è¯´": 3540,
- "å°±æĺ¯": 3541,
- "åį°": 3542,
- "Ġmeet": 3543,
- "è§ĦåĪĴ": 3544,
- "çļĦç¾": 3545,
- "èIJ¥åħ»": 3546,
- "ators": 3547,
- "稳å®ļ": 3548,
- "ode": 3549,
- "çħ®": 3550,
- "Ġassoci": 3551,
- "å¿Ĺ": 3552,
- "è¡ĮæĺŁ": 3553,
- "æĿİ": 3554,
- "Ġreview": 3555,
- "åĩĢ": 3556,
- "ĠRo": 3557,
- "Ġknowledge": 3558,
- "以便": 3559,
- "æµĭè¯ķ": 3560,
- "åIJĪéĢĤ": 3561,
- "sc": 3562,
- "å½¢å¼ı": 3563,
- "Ġfriends": 3564,
- "Ġnature": 3565,
- "Ġcritical": 3566,
- "æ´ĭ": 3567,
- "Ġafter": 3568,
- "erve": 3569,
- "Ġrece": 3570,
- "çļĦæŃ": 3571,
- "汽车": 3572,
- "çķĮ": 3573,
- "Ġloss": 3574,
- "Ġapplications": 3575,
- "å¤ļç§į": 3576,
- "éĶħ": 3577,
- "串": 3578,
- "Ġinsp": 3579,
- "---": 3580,
- "ĠSh": 3581,
- "Ġvol": 3582,
- "lut": 3583,
- "oks": 3584,
- "sequ": 3585,
- "Ġbir": 3586,
- "åIJĪçIJĨ": 3587,
- "Ġnecess": 3588,
- "æĪijæĥ³": 3589,
- "çŃīæĸ¹éĿ¢": 3590,
- "é¼ĵ": 3591,
- "Ġsoft": 3592,
- "Ġlive": 3593,
- "å°ıæĺİ": 3594,
- "ĠInd": 3595,
- "Ġbring": 3596,
- "æĺ¯æĮĩ": 3597,
- "Ġsoil": 3598,
- "ilar": 3599,
- "举": 3600,
- "æĿ¡ä»¶": 3601,
- "Ġtri": 3602,
- "亮": 3603,
- "Ġmom": 3604,
- "æı¡": 3605,
- "ä¼°": 3606,
- "ŀäºī": 3607,
- "çĽij": 3608,
- "èĤ¤": 3609,
- "è´¢åĬ¡": 3610,
- "æ·»åĬł": 3611,
- "é¥®é£Ł": 3612,
- "Ġallowing": 3613,
- "åºķ": 3614,
- "Ġright": 3615,
- "Ġexpert": 3616,
- "Ġsupp": 3617,
- "Ġinit": 3618,
- "çļĦæµ": 3619,
- "arget": 3620,
- "Ġexpect": 3621,
- "Ġ19": 3622,
- "Ġmeasures": 3623,
- "olutions": 3624,
- "just": 3625,
- "arc": 3626,
- "å°ļ": 3627,
- "Ġpractice": 3628,
- "æľīåĬ©äºİ": 3629,
- "大éĩı": 3630,
- "',": 3631,
- "iment": 3632,
- "Ġcontinue": 3633,
- "Ġdiscuss": 3634,
- "100": 3635,
- "éļľ": 3636,
- "çļĦæĦŁ": 3637,
- "Ġreflect": 3638,
- "itation": 3639,
- "åį«": 3640,
- "äºĨä¸Ģ": 3641,
- "ney": 3642,
- "ĠLe": 3643,
- "ised": 3644,
- "è¶ĭ": 3645,
- "äºĨä¸Ģ个": 3646,
- "Ġincreasing": 3647,
- "çļĦæĮ": 3648,
- "Ġstru": 3649,
- "æĢ»ç»ĵ": 3650,
- "ely": 3651,
- "å®ĩ": 3652,
- "Ġauthor": 3653,
- "表éĿ¢": 3654,
- "Ġx": 3655,
- "æķħäºĭ": 3656,
- "emic": 3657,
- "Ġrepresent": 3658,
- "ger": 3659,
- "Ġincreased": 3660,
- "ones": 3661,
- "ains": 3662,
- "Ġtrained": 3663,
- "Ġfish": 3664,
- "Ġstate": 3665,
- "åĨ·": 3666,
- "çĶŁéķ¿": 3667,
- "Ġrenew": 3668,
- "ording": 3669,
- "åĮĹ": 3670,
- "æİªæĸ½": 3671,
- "平衡": 3672,
- "Ġsuccessful": 3673,
- "ä¸ĭéĿ¢": 3674,
- "Ġactivity": 3675,
- "èĮ¶": 3676,
- "éĢĤåºĶ": 3677,
- "èĦij": 3678,
- "æİ¢ç´¢": 3679,
- "ffic": 3680,
- "ç»ĦæĪIJ": 3681,
- "atives": 3682,
- "äºļ": 3683,
- "Ġscen": 3684,
- "æ²Ļ": 3685,
- "gress": 3686,
- "使å¾Ĺ": 3687,
- "æī¿": 3688,
- "Ġdiscrim": 3689,
- "Ġassistants": 3690,
- "Ġexist": 3691,
- "çķĻ": 3692,
- "Ġspace": 3693,
- "æľĢè¿ij": 3694,
- "Ġideas": 3695,
- "éĩĩåıĸ": 3696,
- "light": 3697,
- "注éĩį": 3698,
- "çļĦæĹ¶éĹ´": 3699,
- "è¿İ": 3700,
- "Ġcomb": 3701,
- "éĢĤå½ĵ": 3702,
- "Ġyourself": 3703,
- "rite": 3704,
- "ason": 3705,
- "åĮĢ": 3706,
- "åı¯ä»¥ä½¿ç͍": 3707,
- "åħħ满": 3708,
- "Ġvalues": 3709,
- "æ½": 3710,
- "Ġbiases": 3711,
- "ä¿ĥè¿Ľ": 3712,
- "åľºæĻ¯": 3713,
- "ross": 3714,
- "åį³åı¯": 3715,
- "Ġcru": 3716,
- "Ġnumber": 3717,
- "Ġtype": 3718,
- "rast": 3719,
- "åĩĨç¡®": 3720,
- "This": 3721,
- "Ġpast": 3722,
- "çģ¯": 3723,
- "å®ļä¹ī": 3724,
- "Ġsolutions": 3725,
- "Ġter": 3726,
- "ä¿Ŀè¯ģ": 3727,
- "èͬ": 3728,
- "幸": 3729,
- "åī§": 3730,
- "åħ´è¶£": 3731,
- "åª": 3732,
- "ention": 3733,
- "avor": 3734,
- "Ġscient": 3735,
- "åĬªåĬĽ": 3736,
- "Ġproviders": 3737,
- "Ġpolicies": 3738,
- "alu": 3739,
- "ĠIm": 3740,
- "Ġallows": 3741,
- "Ġintelligence": 3742,
- "çļĦæĸ¹æ³ķ": 3743,
- "è¿Ļæĺ¯": 3744,
- "Ġ`": 3745,
- "Ġemissions": 3746,
- "Ġå°Ĩ": 3747,
- "Ġmeaning": 3748,
- "Ġstyle": 3749,
- "åİŁåĽł": 3750,
- "Ġstrugg": 3751,
- "çļĦç¾İ": 3752,
- "iful": 3753,
- "dition": 3754,
- "éĥ½æľī": 3755,
- "空æ°Ķ": 3756,
- "å®ĥ们çļĦ": 3757,
- "ä¼ĺåĮĸ": 3758,
- "Ġinflu": 3759,
- "åŁºäºİ": 3760,
- "Ġdetails": 3761,
- "Ġtransparency": 3762,
- "Ġmess": 3763,
- "ĠCl": 3764,
- "Ġgame": 3765,
- "pri": 3766,
- "è¶ĭåĬ¿": 3767,
- "å½Ĵ": 3768,
- "ç¿»è¯ij": 3769,
- "æķ£": 3770,
- "By": 3771,
- "éŃ": 3772,
- "ĠAmeric": 3773,
- "Ġproduction": 3774,
- "Ġincorpor": 3775,
- "æĻļ": 3776,
- "Ġinvolve": 3777,
- "Ġhot": 3778,
- "æĻ®": 3779,
- "by": 3780,
- "Ġflow": 3781,
- "Ġemerg": 3782,
- "座": 3783,
- "Ġidea": 3784,
- "åİĭåĬĽ": 3785,
- "éĿĴ": 3786,
- "oms": 3787,
- "èģĮä¸ļ": 3788,
- "Ġreport": 3789,
- "Ġpap": 3790,
- "Ġtherap": 3791,
- "Ġsal": 3792,
- "åıĤä¸İ": 3793,
- "æĸĩåѦ": 3794,
- "æIJŃéħį": 3795,
- "oot": 3796,
- "),": 3797,
- "Ġcr": 3798,
- "Ġprocesses": 3799,
- "gin": 3800,
- "å¹³åı°": 3801,
- "å¯Ł": 3802,
- "Ġpromoting": 3803,
- "æļĸ": 3804,
- "akehold": 3805,
- "ç»§": 3806,
- "iver": 3807,
- "æ¦Ĥ": 3808,
- "Ġmodels": 3809,
- "Ġdra": 3810,
- "èĸ": 3811,
- "Ġgroup": 3812,
- "è¶³å¤Ł": 3813,
- "Ġgreen": 3814,
- "Ġhealthy": 3815,
- "Ġcomfort": 3816,
- "Ġadditional": 3817,
- "ä¸Ģ次": 3818,
- "é¤IJåİħ": 3819,
- "Ġmaterials": 3820,
- "Ġmanage": 3821,
- "çļĦæ¯": 3822,
- "伤": 3823,
- "åıĬæĹ¶": 3824,
- "Ġglo": 3825,
- "Ġstat": 3826,
- "å¿«éĢŁ": 3827,
- "Ġmonitoring": 3828,
- "aily": 3829,
- "rand": 3830,
- "oice": 3831,
- "resh": 3832,
- "ç»Ħç»ĩ": 3833,
- "Ġunder": 3834,
- "Ġnecessary": 3835,
- "Ġhelpful": 3836,
- "ĠCol": 3837,
- "é»ijæ´ŀ": 3838,
- "åģļåĩº": 3839,
- "Ġcourse": 3840,
- "Ġmat": 3841,
- "Ġleg": 3842,
- "Ġface": 3843,
- "令": 3844,
- "èī¯å¥½çļĦ": 3845,
- "ock": 3846,
- "åĮ»çĸĹ": 3847,
- "çĽĸ": 3848,
- "idence": 3849,
- "Ġassociated": 3850,
- "Ġprogress": 3851,
- "åľĨ": 3852,
- "Ġeveryone": 3853,
- "ç¼ĵ": 3854,
- "ĠEng": 3855,
- "word": 3856,
- "èĵĿ": 3857,
- "天æ°Ķ": 3858,
- "Ġactions": 3859,
- "ems": 3860,
- "ĠPl": 3861,
- "å®Ļ": 3862,
- "ush": 3863,
- "顾": 3864,
- "Ġcosts": 3865,
- "ator": 3866,
- "ç©¿": 3867,
- "Ġamounts": 3868,
- "èͬèıľ": 3869,
- "..": 3870,
- "Ġmanner": 3871,
- "Ġconsequ": 3872,
- "æ°ĶåĢĻ": 3873,
- "Ġinsights": 3874,
- "being": 3875,
- "atory": 3876,
- "ener": 3877,
- "lex": 3878,
- "Ġmeans": 3879,
- "Ġcollaboration": 3880,
- "Ġperspect": 3881,
- "orm": 3882,
- "priate": 3883,
- "å°Ĭéĩį": 3884,
- "Ġtarget": 3885,
- "è®°å½ķ": 3886,
- "åĢĴ": 3887,
- "Ġrenewable": 3888,
- "æĦ¿": 3889,
- "èĥ½æºIJ": 3890,
- "Ġinput": 3891,
- "å®ĩå®Ļ": 3892,
- "ape": 3893,
- "Ġadjust": 3894,
- "eries": 3895,
- "Ġdire": 3896,
- "ä¾Ŀ": 3897,
- "ustr": 3898,
- "fect": 3899,
- "Ġbeautiful": 3900,
- "Ġdue": 3901,
- "reci": 3902,
- "çĮ®": 3903,
- "èĥĮæĻ¯": 3904,
- "èĤ¡": 3905,
- "Ġdam": 3906,
- "ik": 3907,
- "Ġadvanced": 3908,
- "çĽ¸å¯¹": 3909,
- "åIJįç§°": 3910,
- "Ġshort": 3911,
- "Ġobject": 3912,
- "è¿ĻéĩĮ": 3913,
- "éĢłæĪIJ": 3914,
- "èIJ¥éĶĢ": 3915,
- "çļĦæĥħæĦŁ": 3916,
- "票": 3917,
- "Ġcountries": 3918,
- "ining": 3919,
- "istic": 3920,
- "Ġplans": 3921,
- "责任": 3922,
- "Ġstakehold": 3923,
- "the": 3924,
- "Ġassess": 3925,
- "æĢĿèĢĥ": 3926,
- "ech": 3927,
- "æĪIJåijĺ": 3928,
- "21": 3929,
- "Ġdaily": 3930,
- "Ġcomput": 3931,
- "çļĦæĥħåĨµ": 3932,
- "æıIJåĩº": 3933,
- "ĠâĢľ": 3934,
- "åªĴ": 3935,
- "ä¸Ńå¿ĥ": 3936,
- "ished": 3937,
- "ĠSe": 3938,
- "onomous": 3939,
- "ern": 3940,
- "ç»´æĬ¤": 3941,
- "ames": 3942,
- "Ġprioritize": 3943,
- "纸": 3944,
- "èĤ¥": 3945,
- "Ġtemper": 3946,
- "æ¸ħæ´ģ": 3947,
- "use": 3948,
- "污": 3949,
- "Ġminim": 3950,
- "æĺ¯åľ¨": 3951,
- "大å°ı": 3952,
- "åĵªäºĽ": 3953,
- "Ġappreci": 3954,
- "reng": 3955,
- "Ġregulations": 3956,
- "ĠZ": 3957,
- "éĶĻ误": 3958,
- "rans": 3959,
- "èĢĮä¸Ķ": 3960,
- "èά": 3961,
- "èij±": 3962,
- "èĨ": 3963,
- "æ°´å¹³": 3964,
- "è´Ńçī©": 3965,
- "åŃĹ符串": 3966,
- "对æĸ¹": 3967,
- "Ġhim": 3968,
- "Ġconsequences": 3969,
- "å·´": 3970,
- "é¼ĵåĬ±": 3971,
- "Ġfil": 3972,
- "人åijĺ": 3973,
- "è·Ŀ离": 3974,
- "ĠWhen": 3975,
- "çļĦæ°´": 3976,
- "çī©çIJĨ": 3977,
- "åIJĮæĹ¶ä¹Ł": 3978,
- "åľ¨è¿Ļ个": 3979,
- "åħ¶æ¬¡": 3980,
- ",\"": 3981,
- "æ¶²": 3982,
- "çĶ·": 3983,
- "ival": 3984,
- "åı¯ä»¥è®©": 3985,
- "æĥ¯": 3986,
- "Ġadvance": 3987,
- "Ġveh": 3988,
- "å¦ĤæŀľæĤ¨": 3989,
- "Ġestab": 3990,
- "ript": 3991,
- "端": 3992,
- "ä¸įä¼ļ": 3993,
- "Ġtransparent": 3994,
- "æķ°éĩı": 3995,
- "çĽĺ": 3996,
- "Ġspeak": 3997,
- "Ġpark": 3998,
- "Ġstakeholders": 3999,
- "éº": 4000,
- "Ġevent": 4001,
- "çļĦæķ°æį®": 4002,
- "èĩªåĬ¨": 4003,
- "ç»ĨèĬĤ": 4004,
- "è¯Ħä¼°": 4005,
- "润": 4006,
- "Ġpreferences": 4007,
- "Ġveget": 4008,
- "æįŁ": 4009,
- "equ": 4010,
- "Ġgl": 4011,
- "Ġpain": 4012,
- "ogra": 4013,
- "Ġtraffic": 4014,
- "Ġoce": 4015,
- "ä¹ĺ": 4016,
- "ext": 4017,
- "âĢĿï¼Į": 4018,
- "Ġanother": 4019,
- "å¤ļå°ij": 4020,
- "Ġagainst": 4021,
- "ç»ıåİĨ": 4022,
- "计ç®Ĺæľº": 4023,
- "èĢIJ": 4024,
- "软件": 4025,
- "ĠPre": 4026,
- "Ġplants": 4027,
- "缸äºĴ": 4028,
- "é¢ij": 4029,
- "\\_": 4030,
- "Ġsame": 4031,
- "rug": 4032,
- "Ġvalu": 4033,
- "Ġocc": 4034,
- "çļĦç¤": 4035,
- "Ġsustainability": 4036,
- "ĠShe": 4037,
- "de": 4038,
- "ote": 4039,
- "Ġdig": 4040,
- "NA": 4041,
- "Ġcrucial": 4042,
- "æī§": 4043,
- "å±Ģ": 4044,
- "æĭŁ": 4045,
- "æĭĮ": 4046,
- "Ġnon": 4047,
- "Ġengaging": 4048,
- "Ġintern": 4049,
- "LP": 4050,
- "温度": 4051,
- "æł¸": 4052,
- "æĬ¥åijĬ": 4053,
- "æĿ¥è¶Ĭ": 4054,
- "hood": 4055,
- "ä¸ī个": 4056,
- "å¦Ĥä¸ĭ": 4057,
- "çī©ä½ĵ": 4058,
- "force": 4059,
- "Ġneeded": 4060,
- "Ġimages": 4061,
- "Ġbuilding": 4062,
- "icious": 4063,
- "ĠæĪij": 4064,
- "è¶ĬæĿ¥è¶Ĭ": 4065,
- "æĶ¾åħ¥": 4066,
- "go": 4067,
- "éĻįä½İ": 4068,
- "å½ĵåľ°": 4069,
- "æ¶Īè´¹èĢħ": 4070,
- "ç£": 4071,
- "iversity": 4072,
- "é¢Ħç®Ĺ": 4073,
- "icle": 4074,
- "æ··åIJĪ": 4075,
- "Ġparticip": 4076,
- "Ġdishes": 4077,
- "Ġthroughout": 4078,
- "Ġwithin": 4079,
- "åı³": 4080,
- "é«ĺçļĦ": 4081,
- "Ġphot": 4082,
- "Ġtrust": 4083,
- "æĦıè¯Ĩ": 4084,
- "以确ä¿Ŀ": 4085,
- "çĬ¶æĢģ": 4086,
- "Ġautomation": 4087,
- "11": 4088,
- "Ġpost": 4089,
- "æīĭæľº": 4090,
- "works": 4091,
- "éĢı": 4092,
- "åºĵ": 4093,
- "Ġwind": 4094,
- "Ġ==": 4095,
- "Ġprocessing": 4096,
- "èĮĥåĽ´": 4097,
- "æĦıä¹ī": 4098,
- "追æ±Ĥ": 4099,
- "é": 4100,
- "å¾Ħ": 4101,
- "éĿł": 4102,
- "ä¸ĸ": 4103,
- "èϽ": 4104,
- "ç«ŀäºī": 4105,
- "Ġappropriate": 4106,
- "æĽ´å¥½çļĦ": 4107,
- "Ġcharacter": 4108,
- "cl": 4109,
- "ç§ĺ": 4110,
- "itude": 4111,
- "Ġteac": 4112,
- "leep": 4113,
- "ĠDevelop": 4114,
- "ince": 4115,
- "å·¦": 4116,
- "ground": 4117,
- "è¡Įä¸ļ": 4118,
- "éĴĪ对": 4119,
- "å¿ħè¦ģ": 4120,
- "Ġdeterm": 4121,
- "----------------": 4122,
- "Ġstreng": 4123,
- "do": 4124,
- "Ġchallenging": 4125,
- "ork": 4126,
- "Ġanx": 4127,
- "èī²çļĦ": 4128,
- "Ġhard": 4129,
- "æĺİç¡®": 4130,
- "åĪĨ享": 4131,
- "æĶ¹åıĺ": 4132,
- "ä½³": 4133,
- "åıªæľī": 4134,
- "å±ķ示": 4135,
- "Ġcamp": 4136,
- "纳": 4137,
- "aj": 4138,
- "etic": 4139,
- "ument": 4140,
- "ä½łåı¯ä»¥": 4141,
- "Ġpollut": 4142,
- "Ġhig": 4143,
- "pping": 4144,
- "ead": 4145,
- "çĦ¶èĢĮ": 4146,
- "第äºĮ": 4147,
- "鸣": 4148,
- "çī©åĵģ": 4149,
- "举": 4150,
- "Ġencourage": 4151,
- "pecial": 4152,
- "Ġacross": 4153,
- "elves": 4154,
- "äºĭä»¶": 4155,
- "cle": 4156,
- "æ©": 4157,
- "åªĴä½ĵ": 4158,
- "ners": 4159,
- "Ġcal": 4160,
- "èϽçĦ¶": 4161,
- "åĽº": 4162,
- "ä¹łæĥ¯": 4163,
- "Ġsafe": 4164,
- "èĥ½éĩı": 4165,
- "istics": 4166,
- "ä¹ĭåīį": 4167,
- "Ġissue": 4168,
- "å¤ļ个": 4169,
- "åĨ³çŃĸ": 4170,
- "è¾¾åΰ": 4171,
- "æĹ©": 4172,
- "ä¸įåı¯": 4173,
- "ä¸Ģ缴": 4174,
- "å·¨": 4175,
- "æĦŁè°¢": 4176,
- "ĠNew": 4177,
- "ä¸Ģ段": 4178,
- "Ġmachines": 4179,
- "å°Ĩåħ¶": 4180,
- "ç»§ç»Ń": 4181,
- "Ġword": 4182,
- "çī¹åĪ«": 4183,
- "Ġagriculture": 4184,
- "æĢİ": 4185,
- "éĢIJæ¸IJ": 4186,
- "éĵ¾": 4187,
- "课": 4188,
- "Ġkind": 4189,
- "å¢Ļ": 4190,
- "谢谢": 4191,
- "Ġalgorithm": 4192,
- "è£ħ饰": 4193,
- "Ġalong": 4194,
- "Ġeasy": 4195,
- "äºij": 4196,
- "è§£åĨ³æĸ¹æ¡Ī": 4197,
- "Ġawareness": 4198,
- "'ve": 4199,
- "æĸ¹åIJij": 4200,
- "Ġnever": 4201,
- "Ġquickly": 4202,
- "Ġrespect": 4203,
- "çļĦæĻ": 4204,
- "Ġamong": 4205,
- "Ġaccountability": 4206,
- "Ġlaw": 4207,
- "ening": 4208,
- "Ġdefin": 4209,
- "Ġsurround": 4210,
- "éĵģ": 4211,
- "Ġpowerful": 4212,
- "An": 4213,
- "Ġcause": 4214,
- "æ¥": 4215,
- "æİĮæı¡": 4216,
- "è¿ĺæĺ¯": 4217,
- "Ġcreative": 4218,
- "è¡Ģ": 4219,
- "Ġlocated": 4220,
- "unning": 4221,
- "åľ°åĮº": 4222,
- "éĿ¢ç§¯": 4223,
- "鼨": 4224,
- "Ġnear": 4225,
- "Ġiniti": 4226,
- "ression": 4227,
- "ä¸ĭæĿ¥": 4228,
- "25": 4229,
- "é©¶": 4230,
- "¾çĹħ": 4231,
- "ables": 4232,
- "æľīè¶£": 4233,
- "循çݯ": 4234,
- "çŃĶæ¡Ī": 4235,
- "çł´": 4236,
- "ication": 4237,
- "éĻ¢": 4238,
- "æ²»çĸĹ": 4239,
- "Ġaddition": 4240,
- "äºĭæĥħ": 4241,
- "Ġbecause": 4242,
- "åıĪ": 4243,
- "èĤĮ": 4244,
- "纪": 4245,
- "side": 4246,
- "æĭħ": 4247,
- "湿": 4248,
- "åįĬ": 4249,
- "顺": 4250,
- "ĠAnd": 4251,
- "Ġrestaurant": 4252,
- "Ġvide": 4253,
- "Ġproblem": 4254,
- "azing": 4255,
- "Ġmembers": 4256,
- "Ġnut": 4257,
- "Ġcou": 4258,
- "浪": 4259,
- "Ġè¿Ļ": 4260,
- "Ġhelping": 4261,
- "ĠIs": 4262,
- "æıIJåįĩ": 4263,
- "ĠĠĠĠĠĠ": 4264,
- "Ġsho": 4265,
- "Ġrelev": 4266,
- "Ġarg": 4267,
- "Ġbalance": 4268,
- "illed": 4269,
- "æĺ¯ä»Ģä¹Ī": 4270,
- "åĬĽéĩı": 4271,
- "ired": 4272,
- "å¤ľ": 4273,
- "åı¯æĮģç»Ń": 4274,
- "Ġperfect": 4275,
- "**": 4276,
- "ification": 4277,
- "æ¶ī": 4278,
- "Ġwildlife": 4279,
- "ane": 4280,
- "Ġrelated": 4281,
- "室åĨħ": 4282,
- "åºľ": 4283,
- "享åıĹ": 4284,
- "ours": 4285,
- "è·ij": 4286,
- "åķĨä¸ļ": 4287,
- "aching": 4288,
- "Ġsun": 4289,
- "Ġrecognition": 4290,
- "elt": 4291,
- "Ġorder": 4292,
- "å¹³åĿĩ": 4293,
- "ging": 4294,
- "临": 4295,
- "çĤ¼": 4296,
- "Ġgoing": 4297,
- "åij¼åIJ¸": 4298,
- "Ġsoftware": 4299,
- "Ġremot": 4300,
- "èijĹåIJį": 4301,
- "幸ç¦ı": 4302,
- "Ġenhance": 4303,
- "èĻļ": 4304,
- "Ġnow": 4305,
- "Ġthreat": 4306,
- "Ġdest": 4307,
- "åĿĩåĮĢ": 4308,
- "Ġacad": 4309,
- "åºĶ对": 4310,
- "çľĭåΰ": 4311,
- "cast": 4312,
- "è¾Ĩ": 4313,
- "ificial": 4314,
- "Ġvery": 4315,
- "ook": 4316,
- "åĮºåŁŁ": 4317,
- "¹ģ": 4318,
- "æĪ¿éĹ´": 4319,
- "æıIJä¾ĽäºĨ": 4320,
- "Ġmotiv": 4321,
- "Ġaccessible": 4322,
- "åĨ³å®ļ": 4323,
- "Ġhy": 4324,
- "å®Ī": 4325,
- "Ġflo": 4326,
- "ug": 4327,
- "Ġinformed": 4328,
- "åĵģè´¨": 4329,
- "çļĦçŁ": 4330,
- "aves": 4331,
- "arr": 4332,
- "ĠWith": 4333,
- "let": 4334,
- "è§ĤçĤ¹": 4335,
- "enge": 4336,
- "è¡ĮåĬ¨": 4337,
- "friend": 4338,
- "ç³ķ": 4339,
- "Ġfurther": 4340,
- "ĠEns": 4341,
- "ç§ģ": 4342,
- "Ġado": 4343,
- "Ġclean": 4344,
- "缸åºĶ": 4345,
- "Ġfre": 4346,
- "pecially": 4347,
- "èĹ": 4348,
- "Ġcapt": 4349,
- "çļĦçľ": 4350,
- "Ġsomeone": 4351,
- "Ġcell": 4352,
- "æĶ¾åľ¨": 4353,
- "欢è¿İ": 4354,
- "ĠâĢ": 4355,
- "Ġdevices": 4356,
- "çļĦæĸ¹å¼ı": 4357,
- "Ġjobs": 4358,
- "augh": 4359,
- "not": 4360,
- "æľīäºĽ": 4361,
- "åħ¬åħ±": 4362,
- "gest": 4363,
- "çļĦçĶŁæ´»": 4364,
- "çľ¼": 4365,
- "çļĦä¿¡æģ¯": 4366,
- "ĠCons": 4367,
- "æİĴåºı": 4368,
- "Ġbenefit": 4369,
- "rect": 4370,
- "å¤ı": 4371,
- "unte": 4372,
- "符åIJĪ": 4373,
- "ä¸Ģä½į": 4374,
- "åĨħéĥ¨": 4375,
- "Ġlooking": 4376,
- "ding": 4377,
- "æĬĺ": 4378,
- "è¾ij": 4379,
- "è¿Ļ个éĹ®é¢ĺ": 4380,
- "Ġespecially": 4381,
- "çľł": 4382,
- "âĢĿãĢĤ": 4383,
- "å¥ı": 4384,
- "ray": 4385,
- "è¿ĺåı¯ä»¥": 4386,
- "åĪĽä½ľ": 4387,
- "coming": 4388,
- "Ġmultiple": 4389,
- "éļIJ": 4390,
- "泡": 4391,
- "æłĩåĩĨ": 4392,
- "Ġmil": 4393,
- "éľĢè¦ģ注æĦı": 4394,
- "Ġanxiety": 4395,
- "æĶ¹è¿Ľ": 4396,
- "å±ĭ": 4397,
- "污æŁĵ": 4398,
- "ç¼ĸç¨ĭ": 4399,
- "è´¹ç͍": 4400,
- "Ġevalu": 4401,
- "imately": 4402,
- "Ġliter": 4403,
- "ograph": 4404,
- "Ġsearch": 4405,
- "16": 4406,
- "enced": 4407,
- "Ġmethods": 4408,
- "çĥĪ": 4409,
- "模å¼ı": 4410,
- "çĬ¶åĨµ": 4411,
- "æĶ¹åĸĦ": 4412,
- "å¤ļæł·": 4413,
- "cer": 4414,
- "å¥ĸ": 4415,
- "Ġsatis": 4416,
- "Ġwebsite": 4417,
- "åĬŀ": 4418,
- "åģ¥èº«": 4419,
- "Ġglobal": 4420,
- "Ġask": 4421,
- "Ġplatforms": 4422,
- "Ġdiseases": 4423,
- "çݰ象": 4424,
- "tics": 4425,
- "æ±ģ": 4426,
- "åΤæĸŃ": 4427,
- "Ġconvers": 4428,
- "Ġrelationship": 4429,
- "设置": 4430,
- "æ³ķå¾ĭ": 4431,
- "Ġmindful": 4432,
- "é¢Ħæµĭ": 4433,
- "overy": 4434,
- "åģľ": 4435,
- "ç͵è§Ĩ": 4436,
- "è§ĦåĪĻ": 4437,
- "aken": 4438,
- "Ġimplementing": 4439,
- "ising": 4440,
- "åıĤåĬł": 4441,
- "æĥħ绪": 4442,
- "Ġprovided": 4443,
- "æ·±åħ¥": 4444,
- "Ġprogrammed": 4445,
- "Ġrelevant": 4446,
- "çļĦçĥ": 4447,
- "çĸ¾çĹħ": 4448,
- "åĮ»çĶŁ": 4449,
- "åĪĽå»º": 4450,
- "Ġgenerate": 4451,
- "æĶ¶åħ¥": 4452,
- "ä¼ij": 4453,
- "izes": 4454,
- "Ġtransform": 4455,
- "éģµ": 4456,
- "astic": 4457,
- "åijĪ": 4458,
- "æ¯ı个人": 4459,
- "è¿Ķ": 4460,
- "iet": 4461,
- "Ġvoice": 4462,
- "éĢĶ": 4463,
- "æĶ¾æĿ¾": 4464,
- "åį´": 4465,
- "èĥľ": 4466,
- "Ġstructure": 4467,
- "æĹ¶å°ļ": 4468,
- "ĠQ": 4469,
- "Ġelse": 4470,
- "duc": 4471,
- "Ġemp": 4472,
- "èģļ": 4473,
- "è´§": 4474,
- "aches": 4475,
- "ç§Ģ": 4476,
- "anks": 4477,
- "Ġnight": 4478,
- "Ġprofessionals": 4479,
- "Ġbas": 4480,
- "è´µ": 4481,
- "ec": 4482,
- "Ġdiversity": 4483,
- "ites": 4484,
- "dr": 4485,
- "åĽ°éļ¾": 4486,
- "ĥåľ": 4487,
- "åŀĥåľ": 4488,
- "åŀĥåľ¾": 4489,
- "Ġdrug": 4490,
- "碳": 4491,
- "Ġname": 4492,
- "åĮĸçļĦ": 4493,
- "aid": 4494,
- "æľĢ大": 4495,
- "æijĦ": 4496,
- "ç®ĢåįķçļĦ": 4497,
- "Ġwarm": 4498,
- "Ġdone": 4499,
- "Ġfunction": 4500,
- "asc": 4501,
- "强è°ĥ": 4502,
- "Ġdemand": 4503,
- "Ġvisual": 4504,
- "Ġupd": 4505,
- "æŃ£åľ¨": 4506,
- "Ġsimilar": 4507,
- "éĢĴ": 4508,
- "æ¯Ľ": 4509,
- "éĶ»": 4510,
- "ently": 4511,
- "Ġvaluable": 4512,
- "Ġdisaster": 4513,
- "ä¸Ģèά": 4514,
- "æ´²": 4515,
- "ĠReg": 4516,
- "Ġdiscrimination": 4517,
- "åĨĻä¸Ģç¯ĩ": 4518,
- "Ġgovernment": 4519,
- "Ġ好çļĦ": 4520,
- "500": 4521,
- "lying": 4522,
- "Ġprev": 4523,
- "Ġprepare": 4524,
- "Ġproblems": 4525,
- "è·³": 4526,
- "Ġprom": 4527,
- "åĨ²": 4528,
- "å®īè£ħ": 4529,
- "éĶ»çĤ¼": 4530,
- "æµĵ": 4531,
- "è¹": 4532,
- "åºĶç͍ç¨ĭåºı": 4533,
- "ng": 4534,
- "Ġcompet": 4535,
- "åĪĨåĪ«": 4536,
- "ological": 4537,
- "审": 4538,
- "Ġtransl": 4539,
- "Ġdirect": 4540,
- "åīĤ": 4541,
- "Ġsuggestions": 4542,
- "Ġpaper": 4543,
- "Ġrecognize": 4544,
- "ton": 4545,
- "Ġmitigate": 4546,
- "讨论": 4547,
- "äºĴåĬ¨": 4548,
- "ĠEar": 4549,
- "Ġamazing": 4550,
- "cre": 4551,
- "é¦Ī": 4552,
- "Ġinvolved": 4553,
- "face": 4554,
- "æľīåħ³": 4555,
- "))": 4556,
- "Ġexce": 4557,
- "Ġproductivity": 4558,
- "èŃ": 4559,
- "é¦Ĩ": 4560,
- "Ġsounds": 4561,
- "Ġidentifying": 4562,
- "],": 4563,
- "é¾Ļ": 4564,
- "Ġfit": 4565,
- "Ġcontribute": 4566,
- "ths": 4567,
- "friendly": 4568,
- "ele": 4569,
- "ified": 4570,
- "iveness": 4571,
- "itely": 4572,
- "ĠX": 4573,
- "Ġled": 4574,
- "åĿı": 4575,
- "Ġhistor": 4576,
- "Ġdat": 4577,
- "Ġjourney": 4578,
- "Ġ}": 4579,
- "Ġselect": 4580,
- "漫": 4581,
- "Ġconduct": 4582,
- "è¿Ľä¸ĢæŃ¥": 4583,
- "ç»ĻæĪij": 4584,
- "Ġlif": 4585,
- "è£ħä¿®": 4586,
- "为ä»Ģä¹Ī": 4587,
- "京": 4588,
- "Ġnav": 4589,
- "Ġwhole": 4590,
- "ç¹ģ": 4591,
- "åĨľ": 4592,
- "æĶ»": 4593,
- "Ġbreat": 4594,
- "Ġmiss": 4595,
- "é¾Ħ": 4596,
- "tt": 4597,
- "sw": 4598,
- "Ġbar": 4599,
- "请éĹ®": 4600,
- "èģĶç½ij": 4601,
- "Ġattract": 4602,
- "æĤ¨åı¯ä»¥": 4603,
- "One": 4604,
- "åħħåĪĨ": 4605,
- "ring": 4606,
- "Ġå½ĵçĦ¶": 4607,
- "ream": 4608,
- "Ġevol": 4609,
- "Ġsn": 4610,
- "ĠEm": 4611,
- "mosp": 4612,
- "Ġchoose": 4613,
- "view": 4614,
- "Ġarr": 4615,
- "Ġsleep": 4616,
- "ended": 4617,
- "æŀ¶": 4618,
- "Ġvehicles": 4619,
- "Ġfresh": 4620,
- "Ġorganization": 4621,
- "è¿Ļ段": 4622,
- "汤": 4623,
- "ĠInt": 4624,
- "Ġcontext": 4625,
- "åı¦å¤ĸ": 4626,
- "Ġocean": 4627,
- "æĦŁåıĹ": 4628,
- "Ġpollution": 4629,
- "urb": 4630,
- "æī§è¡Į": 4631,
- "ersonal": 4632,
- "ĠHealth": 4633,
- "ä¼ĺçĤ¹": 4634,
- "Ġattention": 4635,
- "æľīçĿĢ": 4636,
- "é£ŁæĿIJ": 4637,
- "Ġerr": 4638,
- "çļĦæĿ¥": 4639,
- "çļĦçĪ": 4640,
- "èѦ": 4641,
- "è·Ł": 4642,
- "æĹħè¡Į": 4643,
- "èĴľ": 4644,
- "çļĦæĢĿ": 4645,
- "Ġchatbot": 4646,
- "çļĦéľĢæ±Ĥ": 4647,
- "çķ¥": 4648,
- "Ġfeeling": 4649,
- "Ġimplemented": 4650,
- "社åĮº": 4651,
- "çļĦ建议": 4652,
- "æIJħ": 4653,
- "éĹ»": 4654,
- "åıįé¦Ī": 4655,
- "缴æİ¥": 4656,
- "æĺ¥": 4657,
- "itable": 4658,
- "æĪijä¼ļ": 4659,
- "åį±": 4660,
- "èī¯å¥½": 4661,
- "Ġliving": 4662,
- "åıĺéĩı": 4663,
- "ĠBut": 4664,
- "Ġcomplete": 4665,
- "Ġtrends": 4666,
- "Ġmakes": 4667,
- "ä»Ĭ天": 4668,
- "Ġdistribut": 4669,
- "Ġcommit": 4670,
- "Ġatmosp": 4671,
- "ä¼´": 4672,
- "Ġsensors": 4673,
- "Ġsw": 4674,
- "æĹłè®º": 4675,
- "omen": 4676,
- "æĶ¿åºľ": 4677,
- "Ġchallenge": 4678,
- "Ġturn": 4679,
- "çIJĨ论": 4680,
- "par": 4681,
- "Ġwrite": 4682,
- "ç»ıåħ¸": 4683,
- "emember": 4684,
- "é¥Ń": 4685,
- "æĸ¹ä¾¿": 4686,
- "Ġcu": 4687,
- "Ġvalue": 4688,
- "Ġfund": 4689,
- "pose": 4690,
- "è°ĥæŁ¥": 4691,
- "çĿ¡": 4692,
- "Ġcommunicate": 4693,
- "Ġdisease": 4694,
- "Ġresearc": 4695,
- "Ġlack": 4696,
- "arning": 4697,
- "ĠPark": 4698,
- "çĦ¦": 4699,
- "é«ĺ度": 4700,
- "Ġrather": 4701,
- "宣": 4702,
- "çζ": 4703,
- "éĺ¶": 4704,
- "订": 4705,
- "çĥ§": 4706,
- "Ġhigher": 4707,
- "Ġsummary": 4708,
- "ĠAut": 4709,
- "çļĦæ³": 4710,
- "Ġele": 4711,
- "isms": 4712,
- "Ġreli": 4713,
- "ä¹Łä¼ļ": 4714,
- "fra": 4715,
- "åijĬè¯īæĪij": 4716,
- "æĬ½": 4717,
- "Ġsituations": 4718,
- "Ġmarine": 4719,
- "æĥ³è¦ģ": 4720,
- "inci": 4721,
- "inal": 4722,
- "Ġgain": 4723,
- "Ġdifference": 4724,
- "æľºåĻ¨äºº": 4725,
- "æµģç¨ĭ": 4726,
- "ĠChat": 4727,
- "ç½ijç«Ļ": 4728,
- "æľ«": 4729,
- "Ġcolor": 4730,
- "Ġaspect": 4731,
- "ç½Ĺ": 4732,
- "ĠEduc": 4733,
- "Ġdeploy": 4734,
- "Ġbeauty": 4735,
- "æĤ£": 4736,
- "ruction": 4737,
- "itut": 4738,
- "æĿŁ": 4739,
- "让æĪij们": 4740,
- "éķ¿åº¦": 4741,
- "ules": 4742,
- "æ¶īåıĬ": 4743,
- "Ġdigital": 4744,
- "Ġexisting": 4745,
- "ĠOr": 4746,
- "\\_\\_": 4747,
- "Ġbackground": 4748,
- "çĹĩ": 4749,
- "æ¯ı天": 4750,
- "python": 4751,
- "Ġfarmers": 4752,
- "Ġcontinu": 4753,
- "\":": 4754,
- "Ġgiven": 4755,
- "å°ıæĹ¶": 4756,
- "Ġmoment": 4757,
- "200": 4758,
- "John": 4759,
- "éĿ¢å¯¹": 4760,
- "Ġintro": 4761,
- "Ġtherapy": 4762,
- "è¿ĶåĽŀ": 4763,
- "å¹¶åľ¨": 4764,
- "Ġz": 4765,
- "Ġafford": 4766,
- "ä¸Ŀ": 4767,
- "宽": 4768,
- "ĠÃ": 4769,
- "ĠNational": 4770,
- "èĥ¡": 4771,
- "Ġexercise": 4772,
- "æIJħæĭĮ": 4773,
- "æĶ¯ä»ĺ": 4774,
- "éĺ³åħī": 4775,
- "è¯ļ": 4776,
- "Ġsect": 4777,
- "ĠSu": 4778,
- "å¢ŀéķ¿": 4779,
- "ç¾İ丽": 4780,
- "Ġwa": 4781,
- "以ä¸ĭæĺ¯ä¸ĢäºĽ": 4782,
- "èĽĭç³ķ": 4783,
- "Ġill": 4784,
- "æ¸ħæĻ": 4785,
- "etry": 4786,
- "梦": 4787,
- "ç¾İåĽ½": 4788,
- "ä»į": 4789,
- "oney": 4790,
- "Ġecosystems": 4791,
- "æĮĩ导": 4792,
- "def": 4793,
- "99": 4794,
- "æŁĶ": 4795,
- "pped": 4796,
- "Ġlimit": 4797,
- "çİī": 4798,
- "Ġacademic": 4799,
- "Ġrestaurants": 4800,
- "Ġhead": 4801,
- "ä¿¡ä»»": 4802,
- "asters": 4803,
- "å²ģ": 4804,
- "akers": 4805,
- "14": 4806,
- "As": 4807,
- "æł¡": 4808,
- "é«ĺæķĪ": 4809,
- "phas": 4810,
- "yn": 4811,
- "ç¨ĭ度": 4812,
- "è¾£": 4813,
- "ä¸ĬéĿ¢": 4814,
- "å®¶å±ħ": 4815,
- "term": 4816,
- "ç¾İé£Ł": 4817,
- "Ġovers": 4818,
- "å®ĺ": 4819,
- "Ġindic": 4820,
- "ĠYour": 4821,
- "St": 4822,
- "形象": 4823,
- "è´¡": 4824,
- "åºĬ": 4825,
- "ĠSc": 4826,
- "agra": 4827,
- "羣æŃ£": 4828,
- "oint": 4829,
- "ids": 4830,
- "arent": 4831,
- "éĵ¶": 4832,
- "èģĬ": 4833,
- "Ġregular": 4834,
- "ä¼ĺç§Ģ": 4835,
- "Ġcolle": 4836,
- "çĸij": 4837,
- "Ġsubject": 4838,
- "Ġgreater": 4839,
- "Ġstore": 4840,
- "åŁ¹è®Ń": 4841,
- "Ġimag": 4842,
- "Ġansw": 4843,
- "ä½Ļ": 4844,
- "Ġspot": 4845,
- "åĪĨåŃIJ": 4846,
- "Ġaudience": 4847,
- "pet": 4848,
- "Ġvers": 4849,
- "Ġtrail": 4850,
- "åĭĩ": 4851,
- "erous": 4852,
- "Ġguidance": 4853,
- "Ġspeech": 4854,
- "åĵ²": 4855,
- "æĺ¯çͱ": 4856,
- "è´¡çĮ®": 4857,
- "åIJĪéĢĤçļĦ": 4858,
- "设æĸ½": 4859,
- "ä»ĸ人": 4860,
- "ensive": 4861,
- "å̾": 4862,
- "aling": 4863,
- "Ġprojects": 4864,
- "å³": 4865,
- "Ġtakes": 4866,
- "绩": 4867,
- "That": 4868,
- "Ġbro": 4869,
- "ived": 4870,
- "Ġ&": 4871,
- "åĿIJ": 4872,
- "placement": 4873,
- "è¿ŀæİ¥": 4874,
- "çļĦ社": 4875,
- "ĠTra": 4876,
- "Ġrelax": 4877,
- "ufact": 4878,
- "éģį": 4879,
- "Ġsurv": 4880,
- "åı£åij³": 4881,
- "Ġcreativity": 4882,
- "of": 4883,
- "å¨ģ": 4884,
- "çļĦçł": 4885,
- "Ġbreath": 4886,
- "Ġplaces": 4887,
- "Ġdescrib": 4888,
- "èĭ±è¯Ń": 4889,
- "Ġdamage": 4890,
- "oration": 4891,
- "为æĤ¨": 4892,
- "ift": 4893,
- "Ġcase": 4894,
- "å¹´é¾Ħ": 4895,
- "Ġpress": 4896,
- "çĶľ": 4897,
- "éĩİ": 4898,
- "æĹħ游": 4899,
- "Ġtaken": 4900,
- "ined": 4901,
- "Ġconcept": 4902,
- "æĴŃ": 4903,
- "Ġinteresting": 4904,
- "è·µ": 4905,
- "Ġsea": 4906,
- "60": 4907,
- "Ġfoot": 4908,
- "ĠName": 4909,
- "Ġresearchers": 4910,
- "éĢģ": 4911,
- "Ġwee": 4912,
- ");": 4913,
- "çļĦåħ³éĶ®": 4914,
- "ä¼½": 4915,
- "elebr": 4916,
- "å¡ij": 4917,
- "We": 4918,
- "ç»ı常": 4919,
- "Ġpopulations": 4920,
- "åħ¬å¼ı": 4921,
- "orn": 4922,
- "çĩĥ": 4923,
- "人çĶŁ": 4924,
- "17": 4925,
- "æİ¥åıĹ": 4926,
- "Ġlocation": 4927,
- "Ġinequ": 4928,
- "Ġintervent": 4929,
- "Ġinterested": 4930,
- "Ġdefinitely": 4931,
- "Ġassistance": 4932,
- "è¿Ļä¸Ģ": 4933,
- "åIJĪåIJĮ": 4934,
- "ä¼ĺåĬ¿": 4935,
- "çļĦå·¥ä½ľ": 4936,
- "Ġ12": 4937,
- "Ġmov": 4938,
- "åģı": 4939,
- "åŃĺåĤ¨": 4940,
- "usive": 4941,
- "æĹı": 4942,
- "ï¼īï¼Į": 4943,
- "Ġgas": 4944,
- "Ġinterests": 4945,
- "æ¸ħæĻ°": 4946,
- "Ġgard": 4947,
- "çĸ«": 4948,
- "Ġsay": 4949,
- "夫": 4950,
- "ges": 4951,
- "èIJ¨": 4952,
- "ä¸ļåĬ¡": 4953,
- "个æĢ§": 4954,
- "åIJ¯": 4955,
- "Ġengagement": 4956,
- "Ġbig": 4957,
- "éľĢè¦ģèĢĥèĻij": 4958,
- "Ġprinci": 4959,
- "åij¨åĽ´": 4960,
- "Ġopportunity": 4961,
- "çģ¾": 4962,
- "èĹı": 4963,
- "rel": 4964,
- "缺çĤ¹": 4965,
- "Ġhappy": 4966,
- "åĴĮåħ¶ä»ĸ": 4967,
- "ava": 4968,
- "Ġestablish": 4969,
- "鸡èĽĭ": 4970,
- "iking": 4971,
- "ĠTrans": 4972,
- "rastructure": 4973,
- "forest": 4974,
- "èİ·åıĸ": 4975,
- "èĦļ": 4976,
- "inally": 4977,
- "èµı": 4978,
- "Ġdelicious": 4979,
- "Ġresults": 4980,
- "è§Ĥå¯Ł": 4981,
- "å®ŀè·µ": 4982,
- "Ġlast": 4983,
- "Ġpolit": 4984,
- "æĢ§èĥ½": 4985,
- "For": 4986,
- "bi": 4987,
- "çĽ¸ä¿¡": 4988,
- "ffee": 4989,
- "Ġphr": 4990,
- "Ġforest": 4991,
- "elling": 4992,
- "æµģè¡Į": 4993,
- "atic": 4994,
- "大家": 4995,
- "ĠInst": 4996,
- "æķ°åѦ": 4997,
- "æī©": 4998,
- "å®Įåħ¨": 4999,
- "å¼ķèµ·": 5000,
- "ese": 5001,
- "转æį¢": 5002,
- "Ġaffected": 5003,
- "Ġrobotics": 5004,
- "综ä¸Ĭ": 5005,
- "Ġprop": 5006,
- "让人": 5007,
- "æ²³": 5008,
- "ä¸ŃæľĢ": 5009,
- "Ġautonomous": 5010,
- "Ġhaving": 5011,
- "Ġtrip": 5012,
- "ury": 5013,
- "Ġbiased": 5014,
- "Ġconsiderations": 5015,
- "Ġparticular": 5016,
- "åįł": 5017,
- "æİ¨å¹¿": 5018,
- "Ġinitiatives": 5019,
- "ials": 5020,
- "åij³éģĵ": 5021,
- "Ġtreatments": 5022,
- "Ġemphas": 5023,
- "çĭ¬çī¹çļĦ": 5024,
- "Ġlay": 5025,
- "æĶ¿çŃĸ": 5026,
- "æĢİä¹Ī": 5027,
- "ronic": 5028,
- "play": 5029,
- "Ġcook": 5030,
- "è¿Ľåħ¥": 5031,
- "è½®": 5032,
- "Ġvolunte": 5033,
- "Ġrain": 5034,
- "ĠMon": 5035,
- "Ġconsumption": 5036,
- "èĽĭçϽ": 5037,
- "ĠSoc": 5038,
- "壤": 5039,
- "Ġroutine": 5040,
- "Ġimproved": 5041,
- "To": 5042,
- "人çī©": 5043,
- "读èĢħ": 5044,
- "Ġgoal": 5045,
- "广åijĬ": 5046,
- "éķ¿æľŁ": 5047,
- "Ġey": 5048,
- "He": 5049,
- "Ġoutdo": 5050,
- "Ġcuis": 5051,
- "Ġaway": 5052,
- "Ġbooks": 5053,
- "Ġtopic": 5054,
- "大åĪ©": 5055,
- "house": 5056,
- "Ġones": 5057,
- "ç§Ł": 5058,
- "':": 5059,
- "æĪ¿å±ĭ": 5060,
- "ç§»åĬ¨": 5061,
- "Ġdisasters": 5062,
- "ests": 5063,
- "illing": 5064,
- "绿èī²": 5065,
- "åĵ²åѦ": 5066,
- "æĪIJåĪĨ": 5067,
- "Ġoccur": 5068,
- "ľä¼½": 5069,
- "åľŁå£¤": 5070,
- "çļĦ主è¦ģ": 5071,
- "çݰå®ŀ": 5072,
- "Ġanimal": 5073,
- "é¢Ĩ导": 5074,
- "Ġviews": 5075,
- "éĤ®": 5076,
- "æ°§åĮĸ": 5077,
- "athy": 5078,
- "éģĵå¾·": 5079,
- "社交åªĴä½ĵ": 5080,
- "ĠPersonal": 5081,
- "ĽåĽ´": 5082,
- "Ġpurch": 5083,
- "Ġcountry": 5084,
- "Ġremind": 5085,
- "寸": 5086,
- "Ġrights": 5087,
- "çļĦçݯå¢ĥ": 5088,
- "ĠPr": 5089,
- "Ġline": 5090,
- "ibr": 5091,
- "驾": 5092,
- "Ġmaj": 5093,
- "Ġovercome": 5094,
- "Ġnext": 5095,
- "æīĢè¿°": 5096,
- "è§Ħå®ļ": 5097,
- "Ġinteractions": 5098,
- "Ġconflic": 5099,
- "Ġwhy": 5100,
- "ç³»åĪĹ": 5101,
- "å°¼": 5102,
- "ibly": 5103,
- "çīĽå¥¶": 5104,
- "Ġresponses": 5105,
- "ses": 5106,
- "åѦä¼ļ": 5107,
- "bol": 5108,
- "Ġstandards": 5109,
- "ulner": 5110,
- "对è¯ĿåĨħ容": 5111,
- "lished": 5112,
- "çļĦæĢ§": 5113,
- "çĶŁæĢģç³»ç»Ł": 5114,
- "ann": 5115,
- "æĥħåĨµä¸ĭ": 5116,
- "寻æ±Ĥ": 5117,
- "Ġhold": 5118,
- "den": 5119,
- "åįĥ": 5120,
- "Ġmention": 5121,
- "ĠMany": 5122,
- "缴åΰ": 5123,
- "éģĹ": 5124,
- "hel": 5125,
- "Ġbelieve": 5126,
- "aries": 5127,
- "æľīä¸Ģ个": 5128,
- "13": 5129,
- "Ġatmosphere": 5130,
- "Ġmor": 5131,
- "æĹ¥æľŁ": 5132,
- "ä¹ħ": 5133,
- "ä½łå¥½": 5134,
- "Ġaddressing": 5135,
- "ĠâĢĵ": 5136,
- "çļĦåľ°æĸ¹": 5137,
- "ming": 5138,
- "Ġcannot": 5139,
- "Ġmanufact": 5140,
- "Ġpie": 5141,
- "icing": 5142,
- "Ġstudies": 5143,
- "ç¾İåij³": 5144,
- "ĠAmerican": 5145,
- "ĠNLP": 5146,
- "Ġaccording": 5147,
- "mselves": 5148,
- "èĦĤ": 5149,
- "èĩªä¿¡": 5150,
- "æīĢéľĢ": 5151,
- "Ġthemselves": 5152,
- "Ġremote": 5153,
- "åŁ¹åħ»": 5154,
- "å®īæİĴ": 5155,
- "ä½łéľĢè¦ģ": 5156,
- "Ġregard": 5157,
- "iring": 5158,
- "è¯ĨåĪ«": 5159,
- "Ġarticle": 5160,
- "æģĴ": 5161,
- "æĢ»çļĦæĿ¥": 5162,
- "Ġalign": 5163,
- "æ±ł": 5164,
- "tenance": 5165,
- "faction": 5166,
- "åĬ¨ä½ľ": 5167,
- "çļĦç©": 5168,
- "缩": 5169,
- "æĢ¥": 5170,
- "Ġ100": 5171,
- "Ġtesting": 5172,
- "åŃĹæ¯į": 5173,
- "å¹´è½»": 5174,
- "åζéĢł": 5175,
- "Ġswe": 5176,
- "å°º": 5177,
- "hens": 5178,
- "æ°´æŀľ": 5179,
- "Ġinfrastructure": 5180,
- "èī²å½©": 5181,
- "æĢ»çļĦæĿ¥è¯´": 5182,
- "æľīä»Ģä¹Ī": 5183,
- "text": 5184,
- "车è¾Ĩ": 5185,
- "Ġpay": 5186,
- "rop": 5187,
- "ĊĠĠ": 5188,
- "Ġcaused": 5189,
- "Ġcorrect": 5190,
- "Ġì": 5191,
- "èĥŀ": 5192,
- "ĠMed": 5193,
- "ç²¾ç¥ŀ": 5194,
- "æ°ĶåĢĻåıĺåĮĸ": 5195,
- "ĠRed": 5196,
- "äºĴèģĶç½ij": 5197,
- "Ġengage": 5198,
- "åĪĨ为": 5199,
- "ĠData": 5200,
- "Ġfull": 5201,
- "enc": 5202,
- "éĩįæĸ°": 5203,
- "æŃ£ç¡®çļĦ": 5204,
- "çļĦæ°Ķ": 5205,
- "åıĮæĸ¹": 5206,
- "Ġcomes": 5207,
- "åı¤ä»£": 5208,
- "æŁIJäºĽ": 5209,
- "åijĪçݰ": 5210,
- "Ġtoday": 5211,
- "aged": 5212,
- "æĪijåı¯ä»¥": 5213,
- "æĹ¥å¸¸": 5214,
- "æ»ij": 5215,
- "Ġclin": 5216,
- "Ġ\\": 5217,
- "Ġobs": 5218,
- "Ġartificial": 5219,
- "Ġexcell": 5220,
- "çļĦç¬": 5221,
- "alls": 5222,
- "Ġproduce": 5223,
- "ĠDes": 5224,
- "oss": 5225,
- "è¹Ī": 5226,
- "Ġdraw": 5227,
- "Ġletter": 5228,
- "Ġadvice": 5229,
- "Ġhighly": 5230,
- "çĬ¯": 5231,
- "综ä¸ĬæīĢè¿°": 5232,
- "满æĦı": 5233,
- "Ġprinciples": 5234,
- "èĮĦ": 5235,
- "Ġfeelings": 5236,
- "çļĦæ´": 5237,
- "Ġhom": 5238,
- "Ġfail": 5239,
- "Ġcrop": 5240,
- "å§ľ": 5241,
- "Ġquestion": 5242,
- "Ġdisabilities": 5243,
- "èĪŀè¹Ī": 5244,
- "Ġimplications": 5245,
- "ral": 5246,
- "Ġsing": 5247,
- "40": 5248,
- "Ġfamil": 5249,
- "Ġgovernments": 5250,
- "Ġrecord": 5251,
- "å½¢çĬ¶": 5252,
- "Ġbegin": 5253,
- "ises": 5254,
- "çļĦæĥ³": 5255,
- "achine": 5256,
- "è°±": 5257,
- "Ġvulner": 5258,
- "Ġproper": 5259,
- "Ġoversight": 5260,
- "è´ŁéĿ¢": 5261,
- "Ġemail": 5262,
- "Ġnews": 5263,
- "Ġexploring": 5264,
- "Ġfavor": 5265,
- "楼": 5266,
- "å®ľ": 5267,
- "Ġunivers": 5268,
- "å·®å¼Ĥ": 5269,
- "ï¼īãĢĤ": 5270,
- "è§£åĨ³éĹ®é¢ĺ": 5271,
- "Ġfamous": 5272,
- "gn": 5273,
- "Ġmessage": 5274,
- "atitude": 5275,
- "Ġcra": 5276,
- "Ġcover": 5277,
- "æ·±åĪ»": 5278,
- "åı¯ä»¥éĢīæĭ©": 5279,
- "çĶŁæ´»ä¸Ń": 5280,
- "ç§įç±»": 5281,
- "Ġsmart": 5282,
- "onstr": 5283,
- "vey": 5284,
- "çͲ": 5285,
- "Ġregularly": 5286,
- "ĠSm": 5287,
- "æĦŁè§ī": 5288,
- "Ġthought": 5289,
- "Ġexh": 5290,
- "cure": 5291,
- "ç»ĺ": 5292,
- "认è¯Ĩ": 5293,
- "Ġold": 5294,
- "æĦī": 5295,
- "称为": 5296,
- "Ġfields": 5297,
- "Ġconsist": 5298,
- "ãģ": 5299,
- "ç»Ĩèĥŀ": 5300,
- "Ġhours": 5301,
- "80": 5302,
- "alking": 5303,
- "è§īå¾Ĺ": 5304,
- "ç»Ŀ": 5305,
- "ä½łä»¬": 5306,
- "ĠEnglish": 5307,
- "Ġsignificantly": 5308,
- "Ġsource": 5309,
- "Ġant": 5310,
- "Ġeducational": 5311,
- "Ġtask": 5312,
- "Ġhandle": 5313,
- "æIJľ": 5314,
- "ĠSp": 5315,
- "Ġcalled": 5316,
- "Ġterms": 5317,
- "æ²ī": 5318,
- "Ġwin": 5319,
- "duction": 5320,
- "Ġmodern": 5321,
- "Ġcuisine": 5322,
- "å¥Ĺ": 5323,
- "触": 5324,
- "olutely": 5325,
- "ç«¥": 5326,
- "pite": 5327,
- "Ġfelt": 5328,
- "Ġcompre": 5329,
- "Ġwond": 5330,
- "è¿IJè¡Į": 5331,
- "Ġresil": 5332,
- "çĽ¸ä¼¼": 5333,
- "éĩijèŀį": 5334,
- "çαæĥħ": 5335,
- "ç¬Ķ": 5336,
- "èĪª": 5337,
- "è°Ī": 5338,
- "åĬĽçļĦ": 5339,
- "æľīæīĢ": 5340,
- "æ½ľ": 5341,
- "ulate": 5342,
- "Ġdetection": 5343,
- "å®£ä¼ł": 5344,
- "Ġmatter": 5345,
- "éĩıåŃIJ": 5346,
- "Write": 5347,
- "ç»ĵåIJĪ": 5348,
- "ç»ıè¿ĩ": 5349,
- "Ġdevelopers": 5350,
- "èª": 5351,
- "Ġ---": 5352,
- "人éĻħ": 5353,
- "çѾ": 5354,
- "ï¼ļâĢľ": 5355,
- "Ġinnovative": 5356,
- "ãĢĤâĢĿ": 5357,
- "å½¼": 5358,
- "饼": 5359,
- "è¿ĩ度": 5360,
- "Ġplanet": 5361,
- "åħ°": 5362,
- "å¸ģ": 5363,
- "æķ¬": 5364,
- "Ġlegal": 5365,
- "Ġlot": 5366,
- "æĪIJ为äºĨ": 5367,
- "iate": 5368,
- "Ġmis": 5369,
- "åģĩ设": 5370,
- "çļĦæĸĩ竳": 5371,
- "ĠCompan": 5372,
- "Ġdoc": 5373,
- "Ġcareful": 5374,
- "Ġever": 5375,
- "æĪij们å°Ĩ": 5376,
- "ä¾ĭåŃIJ": 5377,
- "ä¹³": 5378,
- "ä½ľèĢħ": 5379,
- "åIJ§": 5380,
- "æļ´": 5381,
- "Ġremember": 5382,
- "缮çļĦ": 5383,
- "Ġput": 5384,
- "常è§ģçļĦ": 5385,
- "Ġfest": 5386,
- "建设": 5387,
- "å®ŀç͍": 5388,
- "Ġactive": 5389,
- "çªĹ": 5390,
- "outh": 5391,
- "åİŁçIJĨ": 5392,
- "Ġtrying": 5393,
- "è¿·": 5394,
- "缸åIJĮ": 5395,
- "éħĴåºĹ": 5396,
- "Another": 5397,
- "æľĢä½³": 5398,
- "Ġanalytics": 5399,
- "Ġperpet": 5400,
- "ipment": 5401,
- "Ġå¦Ĥæŀľ": 5402,
- "è§Ĥä¼Ĺ": 5403,
- "Ġcelebr": 5404,
- "Ġheav": 5405,
- "Ġmeditation": 5406,
- "大æ°Ķ": 5407,
- "And": 5408,
- "ä¸įéĶĻ": 5409,
- "Ġwhether": 5410,
- "set": 5411,
- "Ġdemonstr": 5412,
- "ä¸Ģ款": 5413,
- "æĶ¶éĽĨ": 5414,
- "éĻIJåζ": 5415,
- "Ġing": 5416,
- "Ġrevolution": 5417,
- "çľģ": 5418,
- "Ġscience": 5419,
- "缮åīį": 5420,
- "Ġthinking": 5421,
- "±ä¹IJ": 5422,
- "课ç¨ĭ": 5423,
- "Ġpack": 5424,
- "Ġimage": 5425,
- "loc": 5426,
- "Ġstories": 5427,
- "uck": 5428,
- "Ġsatisfaction": 5429,
- "Ġcollection": 5430,
- "ho": 5431,
- "èµŀ": 5432,
- "éĿ¢ä¸´": 5433,
- "Ġla": 5434,
- "Ġsymbol": 5435,
- "Ġemb": 5436,
- "Ġhabitats": 5437,
- "Ġlower": 5438,
- "Ġcontinues": 5439,
- "éľĩ": 5440,
- "åĵĪ": 5441,
- "ĠTake": 5442,
- "Ġenvironments": 5443,
- "Ġthree": 5444,
- "Ġenc": 5445,
- "ĠAcc": 5446,
- "æĦıåij³": 5447,
- "åݨ": 5448,
- "chan": 5449,
- "ĠHum": 5450,
- "Ġtrue": 5451,
- "åĪĩæĪIJ": 5452,
- "sing": 5453,
- "âĢĶâĢĶ": 5454,
- "åĩºæĿ¥": 5455,
- "Ġregion": 5456,
- "Ġinterpre": 5457,
- "Ġdiagnosis": 5458,
- "éŀ": 5459,
- "Ġdoing": 5460,
- "Ġrun": 5461,
- "Ġcoffee": 5462,
- "Ġmajor": 5463,
- "Ġmindfulness": 5464,
- "Ġaffordable": 5465,
- "çϾ": 5466,
- "Ġdetailed": 5467,
- "éĿŀ常éĩįè¦ģçļĦ": 5468,
- "çļĦæ²ŁéĢļ": 5469,
- "çļĦæķħ": 5470,
- "åĢĴåħ¥": 5471,
- "Ġthemes": 5472,
- "Ġnetwork": 5473,
- "ï¼īï¼ļ": 5474,
- "ĠUnited": 5475,
- "çļĦæĮĩ": 5476,
- "orts": 5477,
- "åį«çĶŁ": 5478,
- "Ġplanning": 5479,
- "æĥł": 5480,
- "åīª": 5481,
- "ĠProv": 5482,
- "çļĦåºĶç͍": 5483,
- "Ġperi": 5484,
- "Ġaccountable": 5485,
- "çīĻ": 5486,
- "çļĦçģ": 5487,
- "Ġchoice": 5488,
- "ĠComm": 5489,
- "idents": 5490,
- "çļĦå®īåħ¨": 5491,
- "å¹¶ä¸į": 5492,
- "太éĺ³ç³»": 5493,
- "Ġreceive": 5494,
- "Ġclose": 5495,
- "çļĦæĹ¶åĢĻ": 5496,
- "Ġchanging": 5497,
- "ä»·å̼è§Ĥ": 5498,
- "Ġperpetu": 5499,
- "Ġseason": 5500,
- "Ġmen": 5501,
- "Ġlearned": 5502,
- "Ġsituation": 5503,
- "Ġreplace": 5504,
- "head": 5505,
- "让æĪij": 5506,
- "åľ¨ä¸Ģèµ·": 5507,
- "çļĦ空": 5508,
- "éľ²": 5509,
- "Ġenough": 5510,
- "å±ķçݰ": 5511,
- "Ġleaders": 5512,
- "ancing": 5513,
- "Ġtemperature": 5514,
- "åı«": 5515,
- "Ġ30": 5516,
- "æĦıåij³çĿĢ": 5517,
- "æ±ĩ": 5518,
- "ĠGovern": 5519,
- "Ġfocused": 5520,
- "uro": 5521,
- "Ġsimple": 5522,
- "Ġhiking": 5523,
- "æ¯Ĵ": 5524,
- "Ġcomprehens": 5525,
- "äºĪ": 5526,
- "Ġcreated": 5527,
- "cond": 5528,
- "页": 5529,
- "ĠWor": 5530,
- "è¯ģæį®": 5531,
- "Ġworkplace": 5532,
- "Ġcharacters": 5533,
- "çļĦ设计": 5534,
- "Ġmechan": 5535,
- "ĠDis": 5536,
- "ç¥ŀç§ĺ": 5537,
- "å·ŀ": 5538,
- "ĠOn": 5539,
- "": 5540,
- "ç§įæ¤į": 5541,
- "Ġpath": 5542,
- "Ġlimited": 5543,
- "Ġsolar": 5544,
- "çļĦæı": 5545,
- "22": 5546,
- "Ġappreciate": 5547,
- "å¿«ä¹IJ": 5548,
- "æĦŁåıĹåΰ": 5549,
- "èĢĹ": 5550,
- "med": 5551,
- "icine": 5552,
- "Ġnote": 5553,
- "å½ĵåīį": 5554,
- "æĪij们åºĶ该": 5555,
- "Ġseen": 5556,
- "ä¸ĢåIJį": 5557,
- "å°½åı¯èĥ½": 5558,
- "è¿IJç®Ĺ": 5559,
- "è§Ĵ度": 5560,
- "Ġequipment": 5561,
- "Ġspread": 5562,
- "è¸": 5563,
- "访": 5564,
- "åı¥è¯Ŀ": 5565,
- "æĮ¥": 5566,
- "Ġpurpose": 5567,
- "è¯·ä½ł": 5568,
- "Your": 5569,
- "arian": 5570,
- "仪": 5571,
- "Ġperspectives": 5572,
- "åĩºäºĨ": 5573,
- "å©ļ礼": 5574,
- "Ġexcellent": 5575,
- "ĠEnsuring": 5576,
- "Ġreach": 5577,
- "éĺ¶æ®µ": 5578,
- "ä¿Ŀéļľ": 5579,
- "Ġempathy": 5580,
- "ĠMy": 5581,
- "çijľä¼½": 5582,
- "Ġver": 5583,
- "abel": 5584,
- "ĠPredict": 5585,
- "Ġmaintenance": 5586,
- "è¯Ħä»·": 5587,
- "Ġult": 5588,
- "åĴ¨": 5589,
- "ox": 5590,
- "åĴ¨è¯¢": 5591,
- "Ġshared": 5592,
- "ina": 5593,
- "list": 5594,
- "Ġoutdoor": 5595,
- "Ġthoughts": 5596,
- "inating": 5597,
- "éĴ±": 5598,
- "Ġframe": 5599,
- "éĺ¿": 5600,
- "åĪ©æ¶¦": 5601,
- "çļĦæİ¨": 5602,
- "åįļ": 5603,
- "Ġrecent": 5604,
- "Ġaltern": 5605,
- "ared": 5606,
- "==": 5607,
- "Ġroad": 5608,
- "äºĭ项": 5609,
- "ged": 5610,
- "ynt": 5611,
- "Ġspend": 5612,
- "罪": 5613,
- "åıĸå¾Ĺ": 5614,
- "é¹": 5615,
- "li": 5616,
- "æĹ¶æľŁ": 5617,
- "严éĩį": 5618,
- "å¿Ĩ": 5619,
- "å©´": 5620,
- "æİ¥ä¸ĭæĿ¥": 5621,
- "ĠEarth": 5622,
- "ĠChatbots": 5623,
- "Ġsetting": 5624,
- "ç¥Ŀ": 5625,
- "éĶĢåĶ®é¢Ŀ": 5626,
- "伦": 5627,
- "Ġreading": 5628,
- "æİ¢è®¨": 5629,
- "aign": 5630,
- "éŀĭ": 5631,
- "Ġyoung": 5632,
- "Ġcareer": 5633,
- "Ġteachers": 5634,
- "çļĦè´¨éĩı": 5635,
- "å±ŀäºİ": 5636,
- "Ġeasier": 5637,
- "Ġscientific": 5638,
- "ç¾İåħĥ": 5639,
- "Ġspir": 5640,
- "åĬ³": 5641,
- "çļĦæĶ¯": 5642,
- "rist": 5643,
- "èµĦ产": 5644,
- "çĶŁåŃĺ": 5645,
- "èĩ³å°ij": 5646,
- "å§¿": 5647,
- "Ġvideo": 5648,
- "Ġaim": 5649,
- "å®Ŀå®Ŀ": 5650,
- "çζæ¯į": 5651,
- "________________": 5652,
- "alities": 5653,
- "Ġbud": 5654,
- "Ġstreet": 5655,
- "Ġæĺ¯": 5656,
- "æĸ¹ç¨ĭ": 5657,
- "ä¸ĸ纪": 5658,
- "ches": 5659,
- "earch": 5660,
- "æĴ°": 5661,
- "Ġengine": 5662,
- "Ġdisplacement": 5663,
- "ĠRobots": 5664,
- "ervised": 5665,
- "é¡¶": 5666,
- "oud": 5667,
- "Ġwalk": 5668,
- "Ġemergency": 5669,
- "èģĺ": 5670,
- "nal": 5671,
- "Ġdatas": 5672,
- "åĢº": 5673,
- "åIJİçļĦ": 5674,
- "å¾Ī好": 5675,
- "Ġmyself": 5676,
- "çļĦæīĭ": 5677,
- "Ġusage": 5678,
- "Ġshown": 5679,
- "æ®Ĭ": 5680,
- "Ġtypically": 5681,
- "uly": 5682,
- "æĸ°éĹ»": 5683,
- "æĽ¿": 5684,
- "Ġorig": 5685,
- "è½»æĿ¾": 5686,
- "æĺ¾ç¤º": 5687,
- "Ġadopt": 5688,
- "èĤ¡ç¥¨": 5689,
- "Ġparent": 5690,
- "aps": 5691,
- "æĢĿæĥ³": 5692,
- "Ġmarketing": 5693,
- "èĻ«": 5694,
- "éĥ¨éŨ": 5695,
- "çļĦæķĪ": 5696,
- "Ġcomfortable": 5697,
- "åŃ¦ä¹łåĴĮ": 5698,
- "Ġforecast": 5699,
- "iction": 5700,
- "Ġgetting": 5701,
- "Ġtrees": 5702,
- "aving": 5703,
- "çļĦåŁºç¡Ģ": 5704,
- "ready": 5705,
- "æĸ°é²ľ": 5706,
- "going": 5707,
- "¹é¥": 5708,
- "Ġevidence": 5709,
- "¹é¥ª": 5710,
- "ç§ĭ": 5711,
- "æľīå¾Īå¤ļ": 5712,
- "éĿ¢è¯ķ": 5713,
- "éģĩåΰ": 5714,
- "ç»Ļå®ļ": 5715,
- "irc": 5716,
- "åı¯ä»¥æł¹æį®": 5717,
- "驾驶": 5718,
- "å·§åħĭ": 5719,
- "Ġstunning": 5720,
- "çļĦæ¦Ĥ": 5721,
- "æ¡Į": 5722,
- "ĠJohn": 5723,
- "ulation": 5724,
- "åıĤèĢĥ": 5725,
- "Ġflex": 5726,
- "çĦ¦èĻij": 5727,
- "ymakers": 5728,
- "Ġforms": 5729,
- "sh": 5730,
- "val": 5731,
- "ĠSo": 5732,
- "co": 5733,
- "æİ¨åĬ¨": 5734,
- "èħ¿": 5735,
- "ç®Ĭ": 5736,
- "Ġenab": 5737,
- "å°Ĩä¼ļ": 5738,
- "æĶ¯åĩº": 5739,
- "åĿļæĮģ": 5740,
- "红èī²": 5741,
- "Ġoption": 5742,
- "Ġstarted": 5743,
- "ration": 5744,
- "Ġpoetry": 5745,
- "Ġport": 5746,
- "gen": 5747,
- "èªī": 5748,
- "Ġdeliv": 5749,
- "çĶļ": 5750,
- "éĢ»": 5751,
- "éĢī项": 5752,
- "Ġground": 5753,
- "å½¼æŃ¤": 5754,
- "ana": 5755,
- "çļĦæĹ¥": 5756,
- "åľ¨çº¿": 5757,
- "Ġsecure": 5758,
- "Ġæł¹æį®": 5759,
- "饮æĸĻ": 5760,
- "Ġgratitude": 5761,
- "第ä¸ī": 5762,
- "Ġsong": 5763,
- "Ġpoints": 5764,
- "Ġalready": 5765,
- "çļĦçα": 5766,
- "ĠTechn": 5767,
- "Ġreality": 5768,
- "çıŃ": 5769,
- "Ġsince": 5770,
- "Ġpopulation": 5771,
- "yond": 5772,
- "bor": 5773,
- "ĠSocial": 5774,
- "æıIJåıĸ": 5775,
- "å·¥ç¨ĭ": 5776,
- "aff": 5777,
- "交æĺĵ": 5778,
- "Ġworth": 5779,
- "å¡«": 5780,
- "娱ä¹IJ": 5781,
- "Ġdog": 5782,
- "ĠArt": 5783,
- "硬": 5784,
- "æµ·æ´ĭ": 5785,
- "åĨĴ": 5786,
- "çīĪ": 5787,
- "Ġprogramming": 5788,
- "ĠAss": 5789,
- "ĠMachine": 5790,
- "å̼å¾Ĺ": 5791,
- "请è¾ĵåħ¥": 5792,
- "å£°éŁ³": 5793,
- "Ġexercises": 5794,
- "åħī线": 5795,
- "æ³ķåĴĮ": 5796,
- "Ġfeature": 5797,
- "eff": 5798,
- "è¿ĽæŃ¥": 5799,
- "女æĢ§": 5800,
- "Ġefficiently": 5801,
- "çļĦæĬĢæľ¯": 5802,
- "Ġgenetic": 5803,
- "令人": 5804,
- "è´¦": 5805,
- "çļĦ产åĵģ": 5806,
- "åİļ": 5807,
- "åĴĮæĸĩåĮĸ": 5808,
- "éĻĦ": 5809,
- "Ġmob": 5810,
- "综åIJĪ": 5811,
- "ters": 5812,
- "æľīä¸Ģ": 5813,
- "å¦Ĩ": 5814,
- "åįĪ": 5815,
- "Ġoutside": 5816,
- "Ġpropert": 5817,
- "éĤ®ä»¶": 5818,
- "主ä¹ī": 5819,
- "Ġpolicy": 5820,
- "èĩªèº«": 5821,
- "Ġnavigate": 5822,
- "Ġsty": 5823,
- "ç͵èĦij": 5824,
- "Ġabilities": 5825,
- "Ġfaced": 5826,
- "çļĦç¼": 5827,
- "çļĦå°ı": 5828,
- "èķ": 5829,
- "Ġtone": 5830,
- "igation": 5831,
- "åıĤæķ°": 5832,
- "èĽĭçĻ½è´¨": 5833,
- "ä½Ľ": 5834,
- "çĶļèĩ³": 5835,
- "Ġskin": 5836,
- "èĴ¸": 5837,
- "æĭĽ": 5838,
- "éŃĶ": 5839,
- "ashion": 5840,
- "Ġingred": 5841,
- "æĹĭ": 5842,
- "Ġcampaign": 5843,
- "Ġmount": 5844,
- "Ġconsid": 5845,
- "Ġmuse": 5846,
- "nter": 5847,
- "water": 5848,
- "ä¼ļè®®": 5849,
- "Ġprotection": 5850,
- "ä¿ĿéĻ©": 5851,
- "Ġcrops": 5852,
- "ogle": 5853,
- "éļıæĹ¶": 5854,
- "æļĹ": 5855,
- "ium": 5856,
- "ä¹ı": 5857,
- "Ġdiet": 5858,
- "lies": 5859,
- "ç͍æĿ¥": 5860,
- "ĠEncoura": 5861,
- "æĬĹ": 5862,
- "apan": 5863,
- "éĺ²æŃ¢": 5864,
- "Wow": 5865,
- "çļĦåŁºæľ¬": 5866,
- "å¹³æĸ¹": 5867,
- "Ġstep": 5868,
- "åı¯éĿł": 5869,
- "表æĺİ": 5870,
- "Ġpredictions": 5871,
- "Ġsympt": 5872,
- "Ġdiagnoses": 5873,
- "åħ¬åĽŃ": 5874,
- "Ġsupply": 5875,
- "Ġprevious": 5876,
- "ç»ĦåIJĪ": 5877,
- ".,": 5878,
- "çļĦè¿ĩç¨ĭ": 5879,
- "æķı": 5880,
- "su": 5881,
- "aris": 5882,
- "çķħ": 5883,
- "ocol": 5884,
- "æIJľç´¢": 5885,
- "itle": 5886,
- "éĨĴ": 5887,
- "顾客": 5888,
- "éĢ»è¾ij": 5889,
- "éĿŀ常éĩįè¦ģ": 5890,
- "ĠBi": 5891,
- "å·¦åı³": 5892,
- "amm": 5893,
- "Ġeverything": 5894,
- "æĺł": 5895,
- "Ġincred": 5896,
- "Ġpeace": 5897,
- "èľľ": 5898,
- "Ġmuseum": 5899,
- "çĭ¬ç«ĭ": 5900,
- "Ġcomprehensive": 5901,
- "Ġrates": 5902,
- "//": 5903,
- "Ġrad": 5904,
- "åĦ¿ç«¥": 5905,
- "çī¹èī²": 5906,
- "ĠPredictive": 5907,
- "å¼ķåĬĽ": 5908,
- "ler": 5909,
- "å°¤": 5910,
- "icro": 5911,
- "è¡¥": 5912,
- "Ġdetermine": 5913,
- "çļĦåĨħ容": 5914,
- "Ġcompl": 5915,
- "Ġgreenhouse": 5916,
- "èħIJ": 5917,
- "Ġhighlight": 5918,
- "Ġpartners": 5919,
- "Ġdoct": 5920,
- "çļĦ使ç͍": 5921,
- "æŃĮæĽ²": 5922,
- "æĮĩåįĹ": 5923,
- "ĠAf": 5924,
- "æľºæŀĦ": 5925,
- "éĢĢ": 5926,
- "Ġpoems": 5927,
- "å¿ĥåĴĮ": 5928,
- "Ġattend": 5929,
- "çļĦ游": 5930,
- "Ġside": 5931,
- "ales": 5932,
- "Ġmentioned": 5933,
- "ĠAbs": 5934,
- "Ġhistorical": 5935,
- "Ġleft": 5936,
- "以ä¸ĭåĩłä¸ª": 5937,
- "åıĹæ¬¢è¿İ": 5938,
- "èıľåĵģ": 5939,
- "Ġremain": 5940,
- "æĩ": 5941,
- "Ġtours": 5942,
- "łéģĵ": 5943,
- "Ġerrors": 5944,
- "æľºåζ": 5945,
- "æ¦": 5946,
- "æĤ£èĢħ": 5947,
- "more": 5948,
- "Ġexperts": 5949,
- "çļĦçłĶç©¶": 5950,
- "ç»ĵæĿŁ": 5951,
- "Ġwritten": 5952,
- "çłĶ": 5953,
- "Ġet": 5954,
- "input": 5955,
- "æ°Ķä½ĵ": 5956,
- "èļ": 5957,
- "æĥĬ": 5958,
- "Ġage": 5959,
- "éĩįå¤į": 5960,
- "å¼¹": 5961,
- "åѤ": 5962,
- "Ġsymptoms": 5963,
- "Ġbelief": 5964,
- "'d": 5965,
- "iol": 5966,
- "Ġ18": 5967,
- "åħħè¶³": 5968,
- "çıį": 5969,
- "forcement": 5970,
- "æĸĹ": 5971,
- "ªèĮĦ": 5972,
- "Ġ15": 5973,
- "ä¸Ģ个人": 5974,
- "Ġapplic": 5975,
- "è´¥": 5976,
- "ä½įäºİ": 5977,
- "éϤäºĨ": 5978,
- "=\"": 5979,
- "ä¸īè§Ĵ": 5980,
- "æĢĿç»´": 5981,
- "åį·": 5982,
- "Ġfru": 5983,
- "ĠCollabor": 5984,
- "Ġprim": 5985,
- "Ġrequired": 5986,
- "Ġwatch": 5987,
- "è°ĥåij³": 5988,
- "ç»ĵ论": 5989,
- "ony": 5990,
- "Ġguide": 5991,
- "Ġmax": 5992,
- "ĠCould": 5993,
- "Ġadvent": 5994,
- "ĠOverall": 5995,
- "çļĦæĬķ": 5996,
- "Ġexper": 5997,
- "åĺ": 5998,
- "icial": 5999,
- "oster": 6000,
- "çļĦé¢ľèī²": 6001,
- "Ġoperations": 6002,
- "éĥģ": 6003,
- "Ġmoney": 6004,
- "ley": 6005,
- "cling": 6006,
- "Ġoil": 6007,
- "çļ®èĤ¤": 6008,
- "Ġge": 6009,
- "Ġbat": 6010,
- "ĠPh": 6011,
- "Ġsche": 6012,
- "Ġelectric": 6013,
- "vest": 6014,
- "Ġchain": 6015,
- "Ġcapabilities": 6016,
- "ird": 6017,
- "è¯ģæĺİ": 6018,
- "æľĢ好": 6019,
- "ivil": 6020,
- "Ġdepending": 6021,
- "Ġsave": 6022,
- "Ġpractical": 6023,
- "Ġcultures": 6024,
- "缸åºĶçļĦ": 6025,
- "sy": 6026,
- "çļĦç²": 6027,
- "Ġbehind": 6028,
- "æĹ¶éĹ´åĴĮ": 6029,
- "å¹ħ": 6030,
- "ĠAg": 6031,
- "Ġeffectiveness": 6032,
- "Ad": 6033,
- "ĠOf": 6034,
- "Ġanything": 6035,
- "å·§åħĭåĬĽ": 6036,
- "Ġmist": 6037,
- "Ġlanguages": 6038,
- "ĠMake": 6039,
- "å«": 6040,
- "森": 6041,
- "ĠCont": 6042,
- "ĠAbsolutely": 6043,
- "Ġinvestment": 6044,
- "mat": 6045,
- "çļĦæķħäºĭ": 6046,
- "欧": 6047,
- "Ġspeed": 6048,
- "çļĦ温": 6049,
- "Ġcities": 6050,
- "åĨĻä½ľ": 6051,
- "Thanks": 6052,
- "Ġded": 6053,
- "åĪĨéħį": 6054,
- "Ġdark": 6055,
- "Ġsupporting": 6056,
- "å¹ķ": 6057,
- "ĠKe": 6058,
- "鼶": 6059,
- "Ġsharing": 6060,
- "Ġhouse": 6061,
- "è®¤çŁ¥": 6062,
- "Ġsurrounding": 6063,
- "Ġreduced": 6064,
- "Ġfu": 6065,
- "Ġstor": 6066,
- "Ġabs": 6067,
- "Tom": 6068,
- "cent": 6069,
- "ĠEducation": 6070,
- "Ġthr": 6071,
- "ott": 6072,
- "ĠThat": 6073,
- "Ġhear": 6074,
- "ung": 6075,
- "Ġbeyond": 6076,
- "ĠCo": 6077,
- "room": 6078,
- "è¯ĹæŃĮ": 6079,
- "reme": 6080,
- "Ġlittle": 6081,
- "Ġgames": 6082,
- "ä¹ĭåIJİ": 6083,
- "éĥ½ä¼ļ": 6084,
- "è¯ŃéŁ³": 6085,
- "ç¬ij": 6086,
- "çī¹å®ļ": 6087,
- "第ä¸Ģ": 6088,
- "Ġdepression": 6089,
- "Ġinnovation": 6090,
- "ĠFr": 6091,
- "Ġcomputer": 6092,
- "can": 6093,
- "å³°": 6094,
- "ç¼ĸåĨĻä¸Ģ个": 6095,
- "Ġinternational": 6096,
- "Ġcancer": 6097,
- "åѦèĢħ": 6098,
- "Ġdiscover": 6099,
- "het": 6100,
- "Ġcompos": 6101,
- "Ġrecy": 6102,
- "Ġ200": 6103,
- "åIJ«æľī": 6104,
- "çĹĽ": 6105,
- "ç¼ĵè§£": 6106,
- "Ġfrequ": 6107,
- "çͳ": 6108,
- "ĠMar": 6109,
- "çļĦéĢīæĭ©": 6110,
- "Ġunt": 6111,
- "Ġregions": 6112,
- "Ġopin": 6113,
- "ĠGovernments": 6114,
- "æ¶Ĥ": 6115,
- "åĨħå¿ĥ": 6116,
- "ä¸ĬæľĢ": 6117,
- "ä»įçĦ¶": 6118,
- "lier": 6119,
- "æ³³": 6120,
- "äºĴ缸": 6121,
- "ĠStud": 6122,
- "azon": 6123,
- "Ġarch": 6124,
- "Ġchem": 6125,
- "çļĦèĥ½åĬĽ": 6126,
- "çļĦä¸Ģ个": 6127,
- "Ġap": 6128,
- "Ġred": 6129,
- "Ġwomen": 6130,
- "Ġprote": 6131,
- "Ġfinding": 6132,
- "å§»": 6133,
- "éĢĤå½ĵçļĦ": 6134,
- "Ġforward": 6135,
- "对象": 6136,
- "Ġwait": 6137,
- "Ġconsidered": 6138,
- "dule": 6139,
- "backs": 6140,
- "Ġclinical": 6141,
- "åħ·å¤ĩ": 6142,
- "麦": 6143,
- "Ġongoing": 6144,
- "åĨĽ": 6145,
- "Ġfar": 6146,
- "åĴĮè°": 6147,
- "XXX": 6148,
- "Ġpolitical": 6149,
- "Ġcamer": 6150,
- "çļĦè¡Į为": 6151,
- "æĦı大åĪ©": 6152,
- "Ġapps": 6153,
- "åĩıè½»": 6154,
- "Ġreaders": 6155,
- "å©ļå§»": 6156,
- "æ°¸": 6157,
- "ores": 6158,
- "åħ¨éĿ¢": 6159,
- "ĠAfric": 6160,
- "Ġfavorite": 6161,
- "Ġmill": 6162,
- "Ġdang": 6163,
- "ĠStates": 6164,
- "åĢŁ": 6165,
- "寿": 6166,
- "Ġlat": 6167,
- "è¿ĩåİ»": 6168,
- "Ġtruly": 6169,
- "åĽŀçŃĶéĹ®é¢ĺ": 6170,
- "Ġcogn": 6171,
- "ä»°": 6172,
- "ĠJapan": 6173,
- "izz": 6174,
- "çļĦæĿIJ": 6175,
- "xx": 6176,
- "é¢ĺ缮": 6177,
- "ription": 6178,
- "éĤ£äºĽ": 6179,
- "Ġbudget": 6180,
- "Ġvast": 6181,
- "éļIJç§ģ": 6182,
- "Ġpolicymakers": 6183,
- "è¿ĺéľĢè¦ģ": 6184,
- "å¹¶æıIJä¾Ľ": 6185,
- "Ġsweet": 6186,
- "Ġgeneral": 6187,
- "滤": 6188,
- "Ġbirds": 6189,
- "Ġplastic": 6190,
- "Ċĉ": 6191,
- "åĪº": 6192,
- "mental": 6193,
- "Ġinclusive": 6194,
- "Ġtopics": 6195,
- "Ġslow": 6196,
- "ä½łèĥ½": 6197,
- "è¶³å¤ŁçļĦ": 6198,
- "è§Ĩè§ī": 6199,
- "ww": 6200,
- "Ġ使ç͍": 6201,
- "æī¹": 6202,
- "æ¦Ĥ念": 6203,
- "é£Łç͍": 6204,
- "è̳": 6205,
- "cks": 6206,
- "Ġfraud": 6207,
- "Ġingredients": 6208,
- "Ġfasc": 6209,
- "åĮĹ京": 6210,
- "Ġfr": 6211,
- "Ġmanufacturing": 6212,
- "Ġä½ľä¸º": 6213,
- "Ġbeach": 6214,
- "é¡¿": 6215,
- "erious": 6216,
- "å¤ĸè§Ĥ": 6217,
- "é¢Ħéĺ²": 6218,
- "æĿ¥èĩª": 6219,
- "èĤĮèĤī": 6220,
- "Ġdays": 6221,
- "Ġassign": 6222,
- "Ġadvant": 6223,
- "Ġteams": 6224,
- "é¢Ĺ": 6225,
- "nown": 6226,
- "ĠPo": 6227,
- "}{": 6228,
- "Ġminut": 6229,
- "itions": 6230,
- "Ġeasily": 6231,
- "ĠBl": 6232,
- "name": 6233,
- "åŃ¦æł¡": 6234,
- "Ġresponsibility": 6235,
- "åıijæĮ¥": 6236,
- "Ġsensitive": 6237,
- "çŃīäºİ": 6238,
- "cious": 6239,
- "Ġsou": 6240,
- "å±ı": 6241,
- "Ġrich": 6242,
- "å½ĵçĦ¶": 6243,
- "man": 6244,
- "Ġinterpret": 6245,
- "24": 6246,
- "Ġshows": 6247,
- "èģĮåľº": 6248,
- "Ġfall": 6249,
- "è½½": 6250,
- "丰å¯ĮçļĦ": 6251,
- "('": 6252,
- "ä¿®æĶ¹": 6253,
- "æĽ´æį¢": 6254,
- "Al": 6255,
- "åı¯èĥ½æĺ¯": 6256,
- "Ġrate": 6257,
- "Ġprotecting": 6258,
- "fit": 6259,
- "Ġ50": 6260,
- "Ġmovement": 6261,
- "è§Ī": 6262,
- "Ġemployee": 6263,
- "Ġdisord": 6264,
- "åĪĽæĦı": 6265,
- "产åĵģçļĦ": 6266,
- "æľĿ": 6267,
- "ĊĠĠĠĠĠĠĠĠĠĠĠĠĠĠĠ": 6268,
- "Ġpred": 6269,
- "Ġoffering": 6270,
- "åįģåĪĨ": 6271,
- "èĢĮä¸įæĺ¯": 6272,
- "Thank": 6273,
- "æĽ¾": 6274,
- "Ġelements": 6275,
- "ç²Ĵ": 6276,
- "Ġcourses": 6277,
- "Ġintegrated": 6278,
- "ĠCar": 6279,
- "agraph": 6280,
- "åŁºåĽł": 6281,
- "Ġinstead": 6282,
- "èĦ±": 6283,
- "åı¦ä¸Ģ个": 6284,
- "å¯Ĩçłģ": 6285,
- "Ġallowed": 6286,
- "éĿ¢åĮħ": 6287,
- "çķªèĮĦ": 6288,
- "åĴĮåıijå±ķ": 6289,
- "å°ģ": 6290,
- "Ġconnection": 6291,
- "åľ¨ä¸Ģ个": 6292,
- "Ġuseful": 6293,
- "è¯Ńåı¥": 6294,
- "åĪĨå¸ĥ": 6295,
- "表æ¼Ķ": 6296,
- "æľīæĹ¶": 6297,
- "çļĦæĹħ": 6298,
- "çļĦæĢ»": 6299,
- "Ġfashion": 6300,
- "èĭ¦": 6301,
- "è¦ģ注æĦı": 6302,
- "çĶŁç´ł": 6303,
- "Ġnutri": 6304,
- "èĩªè¡Į": 6305,
- "çļĦçĭ": 6306,
- "çIJĨè§£åĴĮ": 6307,
- "Ġcat": 6308,
- "æľºåύåŃ¦ä¹ł": 6309,
- "Ġexhib": 6310,
- "åĴĮæľįåĬ¡": 6311,
- "frac": 6312,
- "epend": 6313,
- "Ġimpacted": 6314,
- "Ġut": 6315,
- "æķ°ç»Ħ": 6316,
- "ĠWorld": 6317,
- "Ġanswer": 6318,
- "erse": 6319,
- "骨": 6320,
- "Ġartists": 6321,
- "åŃ©åŃIJçļĦ": 6322,
- "ä»Ķ": 6323,
- "çĻ»": 6324,
- "ĠAre": 6325,
- "Ġcool": 6326,
- "Ġcognitive": 6327,
- "åIJĦ个": 6328,
- "like": 6329,
- "å©´åĦ¿": 6330,
- "åĪĹåĩº": 6331,
- "å¹»": 6332,
- "ront": 6333,
- "å®¶éķ¿": 6334,
- "缺ä¹ı": 6335,
- "Ġcyber": 6336,
- "ilt": 6337,
- "Ġcapture": 6338,
- "åĹ": 6339,
- "åľ¨äºİ": 6340,
- "Ġthreats": 6341,
- "åĴĮ社ä¼ļ": 6342,
- "Ġcells": 6343,
- "æ¸ħåįķ": 6344,
- "ĠVis": 6345,
- "æİī": 6346,
- "Ġhol": 6347,
- "åŃIJçļĦ": 6348,
- "Ch": 6349,
- "èĿ": 6350,
- "Ġsaid": 6351,
- "Ġdream": 6352,
- "unch": 6353,
- "une": 6354,
- "ĠDon": 6355,
- "家人": 6356,
- "ç±į": 6357,
- "æĦŁåĴĮ": 6358,
- "Ġexperienced": 6359,
- "çļĦéĩįè¦ģæĢ§": 6360,
- "å¼ĥ": 6361,
- "ump": 6362,
- "éĺIJ": 6363,
- "Ġhabitat": 6364,
- "è¢ĭ": 6365,
- "Ġjo": 6366,
- "ç®Ģæ´ģ": 6367,
- "Ġbur": 6368,
- "Ġvisitors": 6369,
- "éĽħ": 6370,
- "çļĦçŁ¥": 6371,
- "Ġentire": 6372,
- "讲述": 6373,
- "äºĨä¸ĢäºĽ": 6374,
- "åįıä½ľ": 6375,
- "ĠBus": 6376,
- "å°¾": 6377,
- "çļĦæķĻ": 6378,
- "olog": 6379,
- "Ġsigns": 6380,
- "Ġspeaker": 6381,
- "çļĦéŁ³ä¹IJ": 6382,
- "Ġnovel": 6383,
- "å±ħæ°ij": 6384,
- "çļĦåıĺåĮĸ": 6385,
- "å°½éĩı": 6386,
- "Ġspirit": 6387,
- "å®Įç¾İ": 6388,
- "è´·": 6389,
- "å¿ħè¦ģçļĦ": 6390,
- "ief": 6391,
- "示ä¾ĭ": 6392,
- "Ġdiv": 6393,
- "æķ´æķ°": 6394,
- "Ġeconomy": 6395,
- "Ġethically": 6396,
- "éĻĪ": 6397,
- "Ġschools": 6398,
- "Ġnetworks": 6399
- },
- "merges": [
- [
- "Ġ",
- "t"
- ],
- [
- "Ġ",
- "a"
- ],
- [
- "i",
- "n"
- ],
- [
- "h",
- "e"
- ],
- [
- "r",
- "e"
- ],
- [
- "ï",
- "¼"
- ],
- [
- "ä",
- "¸"
- ],
- [
- "o",
- "n"
- ],
- [
- "a",
- "t"
- ],
- [
- "ç",
- "ļ"
- ],
- [
- "çļ",
- "Ħ"
- ],
- [
- "ï¼",
- "Į"
- ],
- [
- "Ġ",
- "s"
- ],
- [
- "Ġ",
- "c"
- ],
- [
- "n",
- "d"
- ],
- [
- "ã",
- "Ģ"
- ],
- [
- "e",
- "r"
- ],
- [
- "Ġt",
- "he"
- ],
- [
- "e",
- "s"
- ],
- [
- "e",
- "n"
- ],
- [
- "o",
- "r"
- ],
- [
- "a",
- "n"
- ],
- [
- "Ġa",
- "nd"
- ],
- [
- "in",
- "g"
- ],
- [
- "Ġ",
- "p"
- ],
- [
- "i",
- "t"
- ],
- [
- "a",
- "l"
- ],
- [
- "ãĢ",
- "Ĥ"
- ],
- [
- "Ġ",
- "o"
- ],
- [
- "Ġ",
- "w"
- ],
- [
- "ä",
- "»"
- ],
- [
- "Ġt",
- "o"
- ],
- [
- "i",
- "s"
- ],
- [
- "o",
- "u"
- ],
- [
- "Ġ",
- "m"
- ],
- [
- "ä",
- "º"
- ],
- [
- "Ġ",
- "in"
- ],
- [
- "Ġ",
- "f"
- ],
- [
- "Ġ",
- "b"
- ],
- [
- "e",
- "d"
- ],
- [
- "i",
- "on"
- ],
- [
- "å",
- "ı"
- ],
- [
- "i",
- "c"
- ],
- [
- "Ġ",
- "d"
- ],
- [
- "Ġo",
- "f"
- ],
- [
- "l",
- "e"
- ],
- [
- "a",
- "r"
- ],
- [
- "r",
- "o"
- ],
- [
- "Ġ",
- "Ġ"
- ],
- [
- "å",
- "ħ"
- ],
- [
- "en",
- "t"
- ],
- [
- "æ",
- "ľ"
- ],
- [
- "Ġ",
- "e"
- ],
- [
- "å",
- "Ĵ"
- ],
- [
- "è",
- "¿"
- ],
- [
- "ä",
- "½"
- ],
- [
- "åĴ",
- "Į"
- ],
- [
- "æ",
- "Ī"
- ],
- [
- "å",
- "®"
- ],
- [
- "å",
- "Ī"
- ],
- [
- "v",
- "e"
- ],
- [
- "u",
- "s"
- ],
- [
- "Ġ",
- "re"
- ],
- [
- "Ġ",
- "h"
- ],
- [
- "Ġt",
- "h"
- ],
- [
- "a",
- "s"
- ],
- [
- "c",
- "t"
- ],
- [
- "ç",
- "Ķ"
- ],
- [
- "o",
- "m"
- ],
- [
- "å",
- "ľ"
- ],
- [
- "å",
- "¤"
- ],
- [
- "æ",
- "ĺ"
- ],
- [
- "å",
- "Ĭ"
- ],
- [
- "å",
- "IJ"
- ],
- [
- "ä¸",
- "Ģ"
- ],
- [
- "i",
- "m"
- ],
- [
- "è",
- "¯"
- ],
- [
- "æ",
- "ĸ"
- ],
- [
- "at",
- "ion"
- ],
- [
- "l",
- "o"
- ],
- [
- "ç",
- "»"
- ],
- [
- "Ġb",
- "e"
- ],
- [
- "ãĢ",
- "ģ"
- ],
- [
- "i",
- "d"
- ],
- [
- "Ġc",
- "an"
- ],
- [
- "i",
- "l"
- ],
- [
- "æĺ",
- "¯"
- ],
- [
- "ä",
- "¹"
- ],
- [
- "è",
- "®"
- ],
- [
- "Ġ",
- "A"
- ],
- [
- "Ġth",
- "at"
- ],
- [
- "Ġ",
- "T"
- ],
- [
- "ä»",
- "¥"
- ],
- [
- "c",
- "h"
- ],
- [
- "Ġ",
- "y"
- ],
- [
- "c",
- "e"
- ],
- [
- "ï¼",
- "ļ"
- ],
- [
- "o",
- "t"
- ],
- [
- "er",
- "s"
- ],
- [
- "Ġ",
- "n"
- ],
- [
- "é",
- "Ģ"
- ],
- [
- "r",
- "a"
- ],
- [
- "å",
- "°"
- ],
- [
- "Ġ",
- "g"
- ],
- [
- "Ġy",
- "ou"
- ],
- [
- "å",
- "Ń"
- ],
- [
- "Ġp",
- "ro"
- ],
- [
- "e",
- "t"
- ],
- [
- "å",
- "º"
- ],
- [
- "åľ",
- "¨"
- ],
- [
- "l",
- "y"
- ],
- [
- "Ġ",
- "is"
- ],
- [
- "ä¸",
- "ª"
- ],
- [
- "Ġ",
- "l"
- ],
- [
- "u",
- "r"
- ],
- [
- "Ġf",
- "or"
- ],
- [
- "åı",
- "¯"
- ],
- [
- "é",
- "ĩ"
- ],
- [
- "s",
- "t"
- ],
- [
- "çļĦ",
- "æ"
- ],
- [
- "u",
- "t"
- ],
- [
- "Ġ",
- "he"
- ],
- [
- "i",
- "f"
- ],
- [
- "ĥ",
- "½"
- ],
- [
- "ä",
- "¼"
- ],
- [
- "Ġ",
- "I"
- ],
- [
- "è",
- "¡"
- ],
- [
- "i",
- "r"
- ],
- [
- "it",
- "h"
- ],
- [
- "å",
- "¹"
- ],
- [
- "Ġa",
- "re"
- ],
- [
- "i",
- "g"
- ],
- [
- "Ġs",
- "t"
- ],
- [
- "e",
- "l"
- ],
- [
- "o",
- "l"
- ],
- [
- "å",
- "¸"
- ],
- [
- "u",
- "l"
- ],
- [
- "æ",
- "Ŀ"
- ],
- [
- "æĪ",
- "ij"
- ],
- [
- "Ġ",
- "on"
- ],
- [
- "è",
- "¦"
- ],
- [
- "æľ",
- "ī"
- ],
- [
- "æ",
- "Ĺ"
- ],
- [
- "å",
- "¯"
- ],
- [
- "è",
- "§"
- ],
- [
- "è¦",
- "ģ"
- ],
- [
- "Ġ",
- "us"
- ],
- [
- "a",
- "y"
- ],
- [
- "æ",
- "ķ"
- ],
- [
- "ç",
- "ī"
- ],
- [
- "o",
- "w"
- ],
- [
- "m",
- "ent"
- ],
- [
- "çĶ",
- "¨"
- ],
- [
- "es",
- "s"
- ],
- [
- "ä¸",
- "Ń"
- ],
- [
- "ä»",
- "¬"
- ],
- [
- "äº",
- "º"
- ],
- [
- "å",
- "ĩ"
- ],
- [
- "Ġe",
- "x"
- ],
- [
- "ĠĠ",
- "ĠĠ"
- ],
- [
- "å",
- "Ľ"
- ],
- [
- "å",
- "Į"
- ],
- [
- "å",
- "¼"
- ],
- [
- "Ġc",
- "on"
- ],
- [
- "s",
- "e"
- ],
- [
- "è",
- "ĥ½"
- ],
- [
- "ç",
- "İ"
- ],
- [
- "Ġa",
- "n"
- ],
- [
- "Ġw",
- "ith"
- ],
- [
- "ä¸",
- "º"
- ],
- [
- "at",
- "e"
- ],
- [
- "i",
- "v"
- ],
- [
- "a",
- "m"
- ],
- [
- "Ġa",
- "s"
- ],
- [
- "u",
- "re"
- ],
- [
- "è¿",
- "Ļ"
- ],
- [
- "å",
- "Ĩ"
- ],
- [
- "ç",
- "Ń"
- ],
- [
- "Ġ",
- "or"
- ],
- [
- "å",
- "·"
- ],
- [
- "Ġa",
- "l"
- ],
- [
- "i",
- "es"
- ],
- [
- "ç",
- "§"
- ],
- [
- "Ġ",
- "im"
- ],
- [
- "æ",
- "Ģ"
- ],
- [
- "v",
- "er"
- ],
- [
- "a",
- "b"
- ],
- [
- "äº",
- "Ĩ"
- ],
- [
- "Ġs",
- "u"
- ],
- [
- "Ġd",
- "e"
- ],
- [
- "g",
- "e"
- ],
- [
- "t",
- "h"
- ],
- [
- "åı¯",
- "以"
- ],
- [
- "è",
- "Ģ"
- ],
- [
- "ä¸",
- "į"
- ],
- [
- "å",
- "¾"
- ],
- [
- "ĠA",
- "I"
- ],
- [
- "Ġ",
- "en"
- ],
- [
- "é",
- "Ĺ"
- ],
- [
- "æ",
- "ī"
- ],
- [
- "a",
- "k"
- ],
- [
- "i",
- "ve"
- ],
- [
- "Ġm",
- "o"
- ],
- [
- "å",
- "¥"
- ],
- [
- "é",
- "Ŀ"
- ],
- [
- "ç",
- "Ľ"
- ],
- [
- "it",
- "y"
- ],
- [
- "ä",
- "¿"
- ],
- [
- "u",
- "n"
- ],
- [
- "è",
- "´"
- ],
- [
- "å",
- "į"
- ],
- [
- "Ġ",
- "it"
- ],
- [
- "Ġim",
- "p"
- ],
- [
- "e",
- "ct"
- ],
- [
- "æ",
- "ł"
- ],
- [
- "å",
- "½"
- ],
- [
- "è",
- "ĩ"
- ],
- [
- "é",
- "¢"
- ],
- [
- "å",
- "ĵ"
- ],
- [
- "æ",
- "³"
- ],
- [
- "or",
- "t"
- ],
- [
- "a",
- "d"
- ],
- [
- "æ",
- "ŀ"
- ],
- [
- "e",
- "m"
- ],
- [
- "Ġc",
- "om"
- ],
- [
- "å",
- "¦"
- ],
- [
- "he",
- "r"
- ],
- [
- "e",
- "re"
- ],
- [
- "Ġ",
- "S"
- ],
- [
- "i",
- "al"
- ],
- [
- "Ġ",
- "C"
- ],
- [
- "ĠT",
- "he"
- ],
- [
- "ç",
- "IJ"
- ],
- [
- "çĶ",
- "Ł"
- ],
- [
- "æ",
- "Ħ"
- ],
- [
- "p",
- "p"
- ],
- [
- "æ",
- "Ń"
- ],
- [
- "æĸ",
- "¹"
- ],
- [
- "q",
- "u"
- ],
- [
- "Ġw",
- "h"
- ],
- [
- "å¦",
- "Ĥ"
- ],
- [
- "é",
- "ľ"
- ],
- [
- "an",
- "t"
- ],
- [
- "Ġ",
- "le"
- ],
- [
- "Ġ",
- "v"
- ],
- [
- "æ",
- "ĭ"
- ],
- [
- "æ",
- "Ĭ"
- ],
- [
- "us",
- "t"
- ],
- [
- "æĹ",
- "¶"
- ],
- [
- "çŃ",
- "ī"
- ],
- [
- "å",
- "ij"
- ],
- [
- "å¯",
- "¹"
- ],
- [
- "t",
- "er"
- ],
- [
- "l",
- "d"
- ],
- [
- "è¡",
- "Į"
- ],
- [
- "Ġc",
- "h"
- ],
- [
- "u",
- "d"
- ],
- [
- "éľ",
- "Ģ"
- ],
- [
- "æ",
- "°"
- ],
- [
- "æĪ",
- "IJ"
- ],
- [
- "Ġ",
- "|"
- ],
- [
- "a",
- "c"
- ],
- [
- "a",
- "in"
- ],
- [
- "i",
- "z"
- ],
- [
- "æ",
- "ı"
- ],
- [
- "ion",
- "s"
- ],
- [
- "Ġh",
- "a"
- ],
- [
- "æ",
- "Ľ"
- ],
- [
- "-",
- "-"
- ],
- [
- "æĿ",
- "¥"
- ],
- [
- "om",
- "e"
- ],
- [
- "å",
- "¿"
- ],
- [
- "'",
- "s"
- ],
- [
- "Ġn",
- "e"
- ],
- [
- "es",
- "t"
- ],
- [
- "ä",
- "¾"
- ],
- [
- "u",
- "m"
- ],
- [
- "åĪ",
- "°"
- ],
- [
- "åľ",
- "°"
- ],
- [
- "is",
- "t"
- ],
- [
- "â",
- "Ģ"
- ],
- [
- "çī",
- "©"
- ],
- [
- "ä¸Ģ",
- "个"
- ],
- [
- "l",
- "p"
- ],
- [
- "æ",
- "İ"
- ],
- [
- "èĩ",
- "ª"
- ],
- [
- "Ġhe",
- "lp"
- ],
- [
- "Ġthe",
- "ir"
- ],
- [
- "æ",
- "Ķ"
- ],
- [
- "ä½",
- "ľ"
- ],
- [
- "ä¼",
- "ļ"
- ],
- [
- "æ",
- "Į"
- ],
- [
- "æĪij",
- "们"
- ],
- [
- "n",
- "t"
- ],
- [
- "äº",
- "İ"
- ],
- [
- "åĪ",
- "Ĩ"
- ],
- [
- "re",
- "s"
- ],
- [
- "p",
- "e"
- ],
- [
- "åĩ",
- "º"
- ],
- [
- "id",
- "e"
- ],
- [
- "æ",
- "ĥ"
- ],
- [
- "Ġ",
- "H"
- ],
- [
- "è",
- "¾"
- ],
- [
- "Ġ",
- "M"
- ],
- [
- "f",
- "f"
- ],
- [
- "æ",
- "¯"
- ],
- [
- "o",
- "d"
- ],
- [
- "ic",
- "al"
- ],
- [
- "Ġw",
- "or"
- ],
- [
- "ä¸",
- "Ĭ"
- ],
- [
- "a",
- "re"
- ],
- [
- "æĽ",
- "´"
- ],
- [
- "Ġyou",
- "r"
- ],
- [
- "ä¸",
- "ĭ"
- ],
- [
- "è",
- "µ"
- ],
- [
- "ation",
- "s"
- ],
- [
- "æķ",
- "°"
- ],
- [
- "Ġt",
- "e"
- ],
- [
- "å",
- "İ"
- ],
- [
- "çIJ",
- "Ĩ"
- ],
- [
- "ĠT",
- "h"
- ],
- [
- "è¿",
- "ĩ"
- ],
- [
- "å¹",
- "¶"
- ],
- [
- "d",
- "u"
- ],
- [
- "éĿ",
- "¢"
- ],
- [
- "Ġa",
- "d"
- ],
- [
- "il",
- "l"
- ],
- [
- "æ",
- "µ"
- ],
- [
- "å¥",
- "½"
- ],
- [
- "o",
- "c"
- ],
- [
- "a",
- "ct"
- ],
- [
- "éľĢ",
- "è¦ģ"
- ],
- [
- "ä»",
- "ĸ"
- ],
- [
- "å",
- "±"
- ],
- [
- "Ġ",
- "r"
- ],
- [
- "Ġmo",
- "re"
- ],
- [
- "åŃ",
- "¦"
- ],
- [
- "ç",
- "®"
- ],
- [
- "ig",
- "h"
- ],
- [
- "äº",
- "Ľ"
- ],
- [
- "Ġ",
- "B"
- ],
- [
- "åĬ",
- "¨"
- ],
- [
- "åĵ",
- "ģ"
- ],
- [
- "è",
- "ī"
- ],
- [
- "p",
- "le"
- ],
- [
- "Ġin",
- "c"
- ],
- [
- "åIJ",
- "Į"
- ],
- [
- "Ġex",
- "p"
- ],
- [
- "ou",
- "ld"
- ],
- [
- "ä½",
- "ł"
- ],
- [
- "æ",
- "į"
- ],
- [
- "æı",
- "IJ"
- ],
- [
- "å¤",
- "§"
- ],
- [
- "çİ",
- "°"
- ],
- [
- "p",
- "t"
- ],
- [
- "Ġ",
- "P"
- ],
- [
- "al",
- "l"
- ],
- [
- "åĬ",
- "ł"
- ],
- [
- "ç§",
- "į"
- ],
- [
- "Ġs",
- "e"
- ],
- [
- "åĬ",
- "Ľ"
- ],
- [
- "ou",
- "t"
- ],
- [
- "Ġha",
- "ve"
- ],
- [
- "ç",
- "º"
- ],
- [
- "ä½",
- "ĵ"
- ],
- [
- "Ġpro",
- "v"
- ],
- [
- "åĮ",
- "ĸ"
- ],
- [
- "å¤",
- "ļ"
- ],
- [
- "å®",
- "ļ"
- ],
- [
- "Ġus",
- "ed"
- ],
- [
- "éĢ",
- "ļ"
- ],
- [
- "c",
- "c"
- ],
- [
- "è¿",
- "Ľ"
- ],
- [
- "æ",
- "´"
- ],
- [
- "Ġs",
- "h"
- ],
- [
- "Ġa",
- "b"
- ],
- [
- "o",
- "s"
- ],
- [
- "Ġre",
- "s"
- ],
- [
- "ĠTh",
- "is"
- ],
- [
- "ç",
- "¨"
- ],
- [
- "æĢ",
- "§"
- ],
- [
- "a",
- "ge"
- ],
- [
- "r",
- "i"
- ],
- [
- "æ",
- "¸"
- ],
- [
- "ab",
- "le"
- ],
- [
- "åŃ",
- "IJ"
- ],
- [
- "Ġb",
- "y"
- ],
- [
- "åı",
- "ij"
- ],
- [
- "éĩ",
- "ı"
- ],
- [
- "åº",
- "Ķ"
- ],
- [
- "Ġ",
- "lo"
- ],
- [
- "ä½",
- "¿"
- ],
- [
- "åħ",
- "¶"
- ],
- [
- "é",
- "«"
- ],
- [
- "é",
- "Ļ"
- ],
- [
- "é«",
- "ĺ"
- ],
- [
- "åº",
- "¦"
- ],
- [
- "è§",
- "£"
- ],
- [
- "é",
- "£"
- ],
- [
- "å°",
- "Ĩ"
- ],
- [
- "æ³",
- "ķ"
- ],
- [
- "a",
- "nd"
- ],
- [
- "ä¿",
- "Ŀ"
- ],
- [
- "an",
- "s"
- ],
- [
- "f",
- "or"
- ],
- [
- "ro",
- "m"
- ],
- [
- "re",
- "at"
- ],
- [
- "Ġp",
- "l"
- ],
- [
- "çļĦ",
- "ç"
- ],
- [
- "å¸",
- "¸"
- ],
- [
- "è",
- "½"
- ],
- [
- "Ġw",
- "e"
- ],
- [
- "è¡",
- "¨"
- ],
- [
- "ak",
- "e"
- ],
- [
- "æĪ",
- "ĸ"
- ],
- [
- "é¢",
- "ĺ"
- ],
- [
- "å",
- "Ł"
- ],
- [
- "Ġm",
- "e"
- ],
- [
- "æĸ",
- "ĩ"
- ],
- [
- "t",
- "her"
- ],
- [
- "k",
- "e"
- ],
- [
- "å®",
- "¶"
- ],
- [
- "åIJ",
- "Ī"
- ],
- [
- "æľ",
- "Ģ"
- ],
- [
- "in",
- "e"
- ],
- [
- "Ġs",
- "ome"
- ],
- [
- "ç",
- "±"
- ],
- [
- "éĩ",
- "į"
- ],
- [
- "æŀ",
- "ľ"
- ],
- [
- "Ġ",
- "W"
- ],
- [
- "Ġ",
- "E"
- ],
- [
- "é",
- "ĺ"
- ],
- [
- "ou",
- "r"
- ],
- [
- "r",
- "ou"
- ],
- [
- "ç",
- "Ĥ"
- ],
- [
- "æ",
- "±"
- ],
- [
- "åħ",
- "³"
- ],
- [
- "Ġin",
- "t"
- ],
- [
- "an",
- "ce"
- ],
- [
- "ä¹",
- "Ł"
- ],
- [
- "é",
- "ģ"
- ],
- [
- "ĠĠ",
- "Ġ"
- ],
- [
- "å®",
- "ĥ"
- ],
- [
- "a",
- "g"
- ],
- [
- "æ",
- "¬"
- ],
- [
- "0",
- "0"
- ],
- [
- "è",
- "°"
- ],
- [
- "ul",
- "t"
- ],
- [
- "y",
- "st"
- ],
- [
- "éĹ",
- "´"
- ],
- [
- "ç",
- "³"
- ],
- [
- "Ġt",
- "r"
- ],
- [
- "p",
- "l"
- ],
- [
- "ar",
- "t"
- ],
- [
- "æĦ",
- "Ł"
- ],
- [
- "æ",
- "Ĥ"
- ],
- [
- "at",
- "a"
- ],
- [
- "Ġ",
- "F"
- ],
- [
- "for",
- "m"
- ],
- [
- "è®",
- "¡"
- ],
- [
- "Ġf",
- "rom"
- ],
- [
- "Ġ",
- "D"
- ],
- [
- "éĹ",
- "®"
- ],
- [
- "igh",
- "t"
- ],
- [
- "c",
- "es"
- ],
- [
- "æį",
- "®"
- ],
- [
- "lo",
- "p"
- ],
- [
- "ä¹",
- "ĭ"
- ],
- [
- "Ġf",
- "e"
- ],
- [
- "å",
- "ģ"
- ],
- [
- "ve",
- "lop"
- ],
- [
- "Ġ",
- "1"
- ],
- [
- "åĽ",
- "ł"
- ],
- [
- "k",
- "s"
- ],
- [
- "æ",
- "²"
- ],
- [
- "Ġ",
- "u"
- ],
- [
- "å°",
- "ı"
- ],
- [
- "yst",
- "em"
- ],
- [
- "Ġd",
- "is"
- ],
- [
- "Ġ",
- "R"
- ],
- [
- "g",
- "y"
- ],
- [
- "å·",
- "¥"
- ],
- [
- "ç¨",
- "ĭ"
- ],
- [
- "å",
- "¢"
- ],
- [
- "en",
- "ce"
- ],
- [
- "è",
- "Ĥ"
- ],
- [
- "ç",
- "¡"
- ],
- [
- "Ġt",
- "ra"
- ],
- [
- "å",
- "»"
- ],
- [
- "åħ",
- "¥"
- ],
- [
- "ig",
- "n"
- ],
- [
- "al",
- "th"
- ],
- [
- "Ġsu",
- "ch"
- ],
- [
- "a",
- "ch"
- ],
- [
- "æ",
- "Ļ"
- ],
- [
- "ar",
- "n"
- ],
- [
- "Ġd",
- "ata"
- ],
- [
- "è",
- "¶"
- ],
- [
- "å®",
- "ŀ"
- ],
- [
- "s",
- "o"
- ],
- [
- "Ġde",
- "velop"
- ],
- [
- "ç",
- "¤"
- ],
- [
- "Ġa",
- "cc"
- ],
- [
- "as",
- "t"
- ],
- [
- "èĢ",
- "Į"
- ],
- [
- "Ġ",
- "\""
- ],
- [
- "Ġo",
- "ther"
- ],
- [
- "å»",
- "º"
- ],
- [
- "Ġe",
- "ff"
- ],
- [
- "ç",
- "«"
- ],
- [
- "Ġm",
- "an"
- ],
- [
- "åħ",
- "¬"
- ],
- [
- "å",
- "Ģ"
- ],
- [
- "ç",
- "Ħ"
- ],
- [
- "m",
- "s"
- ],
- [
- "å¼",
- "ı"
- ],
- [
- "èī",
- "²"
- ],
- [
- "å¾",
- "Ĺ"
- ],
- [
- "if",
- "ic"
- ],
- [
- "Ġ",
- "j"
- ],
- [
- "Ġ",
- "ro"
- ],
- [
- "Ġh",
- "as"
- ],
- [
- "ch",
- "n"
- ],
- [
- "o",
- "lo"
- ],
- [
- "åĪ",
- "¶"
- ],
- [
- "è",
- "Ĭ"
- ],
- [
- "使",
- "ç͍"
- ],
- [
- "ou",
- "s"
- ],
- [
- "u",
- "al"
- ],
- [
- "Ġa",
- "t"
- ],
- [
- "Ġe",
- "m"
- ],
- [
- "el",
- "l"
- ],
- [
- "Ġs",
- "ystem"
- ],
- [
- "Ġhe",
- "alth"
- ],
- [
- "it",
- "ies"
- ],
- [
- "Ġex",
- "am"
- ],
- [
- "i",
- "b"
- ],
- [
- "é",
- "Ķ"
- ],
- [
- "Ġab",
- "out"
- ],
- [
- "äº",
- "§"
- ],
- [
- "åIJ",
- "İ"
- ],
- [
- "æĦ",
- "ı"
- ],
- [
- "ç±",
- "»"
- ],
- [
- "Ġp",
- "re"
- ],
- [
- "æĤ",
- "¨"
- ],
- [
- "Ġal",
- "so"
- ],
- [
- "ent",
- "s"
- ],
- [
- "Ġin",
- "d"
- ],
- [
- "in",
- "d"
- ],
- [
- "éĢ",
- "Ĥ"
- ],
- [
- "Ġte",
- "chn"
- ],
- [
- "res",
- "s"
- ],
- [
- "æĥ",
- "ħ"
- ],
- [
- "éĹ®",
- "é¢ĺ"
- ],
- [
- "Ġus",
- "e"
- ],
- [
- "ï¼",
- "Ł"
- ],
- [
- "Ġinc",
- "l"
- ],
- [
- "Ġs",
- "pe"
- ],
- [
- "ic",
- "h"
- ],
- [
- "p",
- "s"
- ],
- [
- "æľ",
- "º"
- ],
- [
- "Ġthe",
- "y"
- ],
- [
- "i",
- "e"
- ],
- [
- "Ġh",
- "ow"
- ],
- [
- "Ġwor",
- "k"
- ],
- [
- "ä¸",
- "ļ"
- ],
- [
- "ç",
- "´"
- ],
- [
- "Ġimp",
- "ro"
- ],
- [
- "Ġle",
- "arn"
- ],
- [
- "æĸ",
- "°"
- ],
- [
- "çĤ",
- "¹"
- ],
- [
- "Ġcon",
- "t"
- ],
- [
- "ar",
- "d"
- ],
- [
- "çĦ",
- "¶"
- ],
- [
- "æľ",
- "¬"
- ],
- [
- "ç³",
- "»"
- ],
- [
- "ç¡",
- "®"
- ],
- [
- "è®",
- "¾"
- ],
- [
- "åħ",
- "·"
- ],
- [
- "éĢ",
- "ī"
- ],
- [
- "èĢ",
- "ħ"
- ],
- [
- "é",
- "ħ"
- ],
- [
- "g",
- "h"
- ],
- [
- "_",
- "_"
- ],
- [
- "Ġn",
- "ot"
- ],
- [
- "ç",
- "ľ"
- ],
- [
- "çĽ",
- "¸"
- ],
- [
- "Ġprov",
- "ide"
- ],
- [
- "å",
- "ī"
- ],
- [
- "ion",
- "al"
- ],
- [
- "Ġen",
- "s"
- ],
- [
- "ä¸",
- "İ"
- ],
- [
- "è´",
- "¨"
- ],
- [
- "ent",
- "ial"
- ],
- [
- "ç»",
- "ı"
- ],
- [
- "å¿",
- "ĥ"
- ],
- [
- "an",
- "g"
- ],
- [
- "æŃ",
- "¤"
- ],
- [
- "e",
- "nd"
- ],
- [
- "Ġp",
- "o"
- ],
- [
- "è¿Ľ",
- "è¡Į"
- ],
- [
- "ic",
- "e"
- ],
- [
- "Ġ",
- "-"
- ],
- [
- "Ġw",
- "ay"
- ],
- [
- "å·",
- "±"
- ],
- [
- "Ġ",
- "2"
- ],
- [
- "im",
- "e"
- ],
- [
- "ç",
- "½"
- ],
- [
- "èĩª",
- "å·±"
- ],
- [
- "Ġ",
- "un"
- ],
- [
- "b",
- "ot"
- ],
- [
- "Ġincl",
- "ud"
- ],
- [
- "at",
- "ed"
- ],
- [
- "æ°",
- "´"
- ],
- [
- "é",
- "ķ"
- ],
- [
- "æĮ",
- "ģ"
- ],
- [
- "ä»",
- "£"
- ],
- [
- "é",
- "¡"
- ],
- [
- "æī",
- "Ģ"
- ],
- [
- "ç",
- "Ŀ"
- ],
- [
- "pp",
- "ort"
- ],
- [
- "o",
- "od"
- ],
- [
- "i",
- "ke"
- ],
- [
- "r",
- "u"
- ],
- [
- "Ġcom",
- "m"
- ],
- [
- "Ġ",
- "L"
- ],
- [
- "ä¿",
- "¡"
- ],
- [
- "Ġ",
- "G"
- ],
- [
- "ç",
- "Ł"
- ],
- [
- "çĶ",
- "µ"
- ],
- [
- "Ġw",
- "as"
- ],
- [
- "lo",
- "w"
- ],
- [
- "er",
- "v"
- ],
- [
- "åĮ",
- "ħ"
- ],
- [
- "ĠĠĠĠ",
- "ĠĠĠĠ"
- ],
- [
- "Ġw",
- "he"
- ],
- [
- "d",
- "it"
- ],
- [
- "Ġwh",
- "ich"
- ],
- [
- "Ġcom",
- "p"
- ],
- [
- "é",
- "ª"
- ],
- [
- "o",
- "re"
- ],
- [
- "ç",
- "¾"
- ],
- [
- "Ġ",
- "="
- ],
- [
- "çī",
- "¹"
- ],
- [
- "if",
- "f"
- ],
- [
- "er",
- "t"
- ],
- [
- "æ",
- "ģ"
- ],
- [
- "r",
- "it"
- ],
- [
- "Ġre",
- "c"
- ],
- [
- "åĨ",
- "ħ"
- ],
- [
- "æĺ",
- "İ"
- ],
- [
- "or",
- "s"
- ],
- [
- "Ġp",
- "at"
- ],
- [
- "--",
- "--"
- ],
- [
- "æ",
- "Ł"
- ],
- [
- "Ġa",
- "pp"
- ],
- [
- "n",
- "s"
- ],
- [
- "åĬ",
- "¡"
- ],
- [
- "al",
- "y"
- ],
- [
- "a",
- "ce"
- ],
- [
- "æ´",
- "»"
- ],
- [
- "ä¾",
- "Ľ"
- ],
- [
- "a",
- "v"
- ],
- [
- "ä¸",
- "»"
- ],
- [
- "Ġp",
- "ers"
- ],
- [
- "ç",
- "ĥ"
- ],
- [
- "è¯",
- "¥"
- ],
- [
- "Ġm",
- "y"
- ],
- [
- "ç",
- "©"
- ],
- [
- "er",
- "i"
- ],
- [
- "è®",
- "©"
- ],
- [
- "æĬ",
- "Ģ"
- ],
- [
- "éķ",
- "¿"
- ],
- [
- "ac",
- "k"
- ],
- [
- "Ġ",
- "N"
- ],
- [
- "Ġd",
- "iff"
- ],
- [
- "Ġth",
- "is"
- ],
- [
- "å",
- "Ŀ"
- ],
- [
- "Ġens",
- "ure"
- ],
- [
- "å½",
- "ĵ"
- ],
- [
- "Ġo",
- "ut"
- ],
- [
- "Ġc",
- "l"
- ],
- [
- "Ġ",
- "k"
- ],
- [
- "é",
- "¦"
- ],
- [
- "ou",
- "nt"
- ],
- [
- "çİ",
- "¯"
- ],
- [
- "åĬ",
- "©"
- ],
- [
- "Ġtechn",
- "olo"
- ],
- [
- "Ġthe",
- "se"
- ],
- [
- "f",
- "ul"
- ],
- [
- "é",
- "ļ"
- ],
- [
- "æ",
- "·"
- ],
- [
- "ä¸Ģ",
- "äºĽ"
- ],
- [
- "Ġs",
- "oc"
- ],
- [
- "å¼",
- "Ģ"
- ],
- [
- "å¤",
- "©"
- ],
- [
- "Ġe",
- "v"
- ],
- [
- "Ġre",
- "du"
- ],
- [
- "Ġthe",
- "m"
- ],
- [
- "Ġ",
- "("
- ],
- [
- "é",
- "ĥ½"
- ],
- [
- "æĪ",
- "·"
- ],
- [
- "è",
- "·"
- ],
- [
- "åľ",
- "º"
- ],
- [
- "æ°",
- "Ķ"
- ],
- [
- "Ġ",
- "Y"
- ],
- [
- "è¯",
- "Ń"
- ],
- [
- "éĢļ",
- "è¿ĩ"
- ],
- [
- "å±",
- "ķ"
- ],
- [
- "Ġc",
- "o"
- ],
- [
- "å½",
- "±"
- ],
- [
- "ç",
- "¬"
- ],
- [
- "Ġan",
- "aly"
- ],
- [
- "æ¯",
- "Ķ"
- ],
- [
- "åħ",
- "¨"
- ],
- [
- "Ġimpro",
- "ve"
- ],
- [
- "ç»",
- "ĵ"
- ],
- [
- "å¹",
- "´"
- ],
- [
- "ç",
- "ķ"
- ],
- [
- "çĿ",
- "Ģ"
- ],
- [
- "Ġh",
- "um"
- ],
- [
- "Ġ",
- "qu"
- ],
- [
- "ç®",
- "Ĺ"
- ],
- [
- "Ġ",
- "O"
- ],
- [
- "é£",
- "Ł"
- ],
- [
- "il",
- "ity"
- ],
- [
- "Ġsystem",
- "s"
- ],
- [
- "åı",
- "ĺ"
- ],
- [
- "a",
- "il"
- ],
- [
- "ç",
- "¼"
- ],
- [
- "ç",
- "ł"
- ],
- [
- "è¿Ļ",
- "个"
- ],
- [
- "æıIJ",
- "ä¾Ľ"
- ],
- [
- "as",
- "e"
- ],
- [
- "å",
- "ŀ"
- ],
- [
- "ment",
- "s"
- ],
- [
- "Ġp",
- "ot"
- ],
- [
- "Ġan",
- "y"
- ],
- [
- "ä½",
- "Ĩ"
- ],
- [
- "Ġcon",
- "s"
- ],
- [
- "ĠI",
- "t"
- ],
- [
- "æł",
- "¼"
- ],
- [
- "Ġa",
- "r"
- ],
- [
- "æľ",
- "¯"
- ],
- [
- "éĿ",
- "ŀ"
- ],
- [
- "Ġd",
- "o"
- ],
- [
- "Ġm",
- "ay"
- ],
- [
- "æĭ",
- "©"
- ],
- [
- "u",
- "e"
- ],
- [
- "éĢī",
- "æĭ©"
- ],
- [
- "r",
- "y"
- ],
- [
- "é",
- "ĥ"
- ],
- [
- "Ġl",
- "ike"
- ],
- [
- "on",
- "g"
- ],
- [
- "è",
- "ģ"
- ],
- [
- "`",
- "`"
- ],
- [
- "i",
- "le"
- ],
- [
- "æ±",
- "Ĥ"
- ],
- [
- "Ġne",
- "w"
- ],
- [
- "i",
- "ent"
- ],
- [
- "Ġimp",
- "act"
- ],
- [
- "è¿",
- "ĺ"
- ],
- [
- "æ³",
- "¨"
- ],
- [
- "ä¹",
- "Ī"
- ],
- [
- "çĽ",
- "®"
- ],
- [
- "âĢ",
- "ľ"
- ],
- [
- "âĢ",
- "Ŀ"
- ],
- [
- "e",
- "f"
- ],
- [
- "ä¾",
- "ĭ"
- ],
- [
- "Ġpot",
- "ential"
- ],
- [
- "o",
- "k"
- ],
- [
- "åı¯",
- "èĥ½"
- ],
- [
- "Ġtr",
- "ans"
- ],
- [
- "Ġa",
- "ct"
- ],
- [
- "ï¼",
- "ī"
- ],
- [
- "Ġspe",
- "c"
- ],
- [
- "æ",
- "¶"
- ],
- [
- "Ġw",
- "ill"
- ],
- [
- "äº",
- "¤"
- ],
- [
- "iz",
- "e"
- ],
- [
- "ç¾",
- "İ"
- ],
- [
- "å¸",
- "Ĥ"
- ],
- [
- "Ġst",
- "ud"
- ],
- [
- "p",
- "on"
- ],
- [
- "è",
- "º"
- ],
- [
- "ä¸į",
- "åIJĮ"
- ],
- [
- "on",
- "e"
- ],
- [
- "å¾",
- "Ī"
- ],
- [
- "åı",
- "Ĭ"
- ],
- [
- "å¦Ĥ",
- "æŀľ"
- ],
- [
- "çIJ",
- "ĥ"
- ],
- [
- "an",
- "ge"
- ],
- [
- "Ġne",
- "ed"
- ],
- [
- "å¤",
- "ĸ"
- ],
- [
- "et",
- "y"
- ],
- [
- "ak",
- "ing"
- ],
- [
- "è¯",
- "·"
- ],
- [
- "at",
- "er"
- ],
- [
- "Ġpers",
- "on"
- ],
- [
- "id",
- "ent"
- ],
- [
- "Ġs",
- "o"
- ],
- [
- "Ġm",
- "ake"
- ],
- [
- "å¹",
- "³"
- ],
- [
- "å¤",
- "Ł"
- ],
- [
- "èº",
- "«"
- ],
- [
- "ï¼",
- "Ī"
- ],
- [
- "Ġin",
- "form"
- ],
- [
- "æ",
- "¡"
- ],
- [
- "äº",
- "ĭ"
- ],
- [
- "åı",
- "Ĺ"
- ],
- [
- "as",
- "ed"
- ],
- [
- "il",
- "d"
- ],
- [
- "Ġof",
- "f"
- ],
- [
- "Ġthe",
- "re"
- ],
- [
- "c",
- "is"
- ],
- [
- "è",
- "¢"
- ],
- [
- "éĥ",
- "¨"
- ],
- [
- "æ¯",
- "ı"
- ],
- [
- "ra",
- "ct"
- ],
- [
- "as",
- "s"
- ],
- [
- "Ġlearn",
- "ing"
- ],
- [
- "å",
- "ĸ"
- ],
- [
- "å½",
- "¢"
- ],
- [
- "i",
- "re"
- ],
- [
- "ä»",
- "İ"
- ],
- [
- "bot",
- "s"
- ],
- [
- "è",
- "Ļ"
- ],
- [
- "å¸",
- "®"
- ],
- [
- "Ġd",
- "es"
- ],
- [
- "ĠI",
- "n"
- ],
- [
- "c",
- "ess"
- ],
- [
- "Ġp",
- "e"
- ],
- [
- "if",
- "y"
- ],
- [
- "Ġwh",
- "o"
- ],
- [
- "ä¹",
- "ł"
- ],
- [
- "æľ",
- "Ł"
- ],
- [
- "Ġexp",
- "eri"
- ],
- [
- "é",
- "Ĥ"
- ],
- [
- "Ġs",
- "c"
- ],
- [
- "e",
- "p"
- ],
- [
- "ä½",
- "ķ"
- ],
- [
- "Ġt",
- "ime"
- ],
- [
- "éĿŀ",
- "常"
- ],
- [
- "æĭ",
- "¬"
- ],
- [
- "å",
- "ķ"
- ],
- [
- "以",
- "ä¸ĭ"
- ],
- [
- "éģ",
- "ĵ"
- ],
- [
- "Ġcomm",
- "un"
- ],
- [
- "Ġc",
- "ould"
- ],
- [
- "a",
- "p"
- ],
- [
- "è",
- "IJ"
- ],
- [
- "è°",
- "ĥ"
- ],
- [
- "l",
- "ic"
- ],
- [
- "du",
- "ct"
- ],
- [
- "Ġit",
- "s"
- ],
- [
- "c",
- "y"
- ],
- [
- "è¯",
- "´"
- ],
- [
- "Ġm",
- "ed"
- ],
- [
- "Ġc",
- "ol"
- ],
- [
- "ul",
- "ar"
- ],
- [
- "éĩį",
- "è¦ģ"
- ],
- [
- "Ġs",
- "p"
- ],
- [
- "åĪ",
- "©"
- ],
- [
- "èµ",
- "·"
- ],
- [
- "Ġprov",
- "id"
- ],
- [
- "ic",
- "es"
- ],
- [
- "å",
- "Ļ"
- ],
- [
- "æĸ",
- "Ļ"
- ],
- [
- "Ġimp",
- "ort"
- ],
- [
- "ur",
- "al"
- ],
- [
- "åŃ",
- "Ĺ"
- ],
- [
- "Ġu",
- "nd"
- ],
- [
- "in",
- "t"
- ],
- [
- "Ġo",
- "ver"
- ],
- [
- "åı",
- "¸"
- ],
- [
- "æł",
- "¹"
- ],
- [
- "é",
- "¥"
- ],
- [
- "pl",
- "es"
- ],
- [
- "ä»ĸ",
- "们"
- ],
- [
- "g",
- "ra"
- ],
- [
- "ur",
- "ing"
- ],
- [
- "n",
- "ow"
- ],
- [
- "åį",
- "ķ"
- ],
- [
- "è¿Ļ",
- "äºĽ"
- ],
- [
- "åī",
- "į"
- ],
- [
- "å®",
- "ī"
- ],
- [
- "Ġp",
- "r"
- ],
- [
- "åĮħ",
- "æĭ¬"
- ],
- [
- "ç»",
- "Ļ"
- ],
- [
- "T",
- "he"
- ],
- [
- "ä½",
- "į"
- ],
- [
- "å",
- "§"
- ],
- [
- "ç´",
- "ł"
- ],
- [
- "åij",
- "ĺ"
- ],
- [
- "Ġ",
- "ident"
- ],
- [
- "åŀ",
- "ĭ"
- ],
- [
- "Ġad",
- "d"
- ],
- [
- "å¼",
- "º"
- ],
- [
- "æĺ¯",
- "ä¸Ģ"
- ],
- [
- "i",
- "p"
- ],
- [
- "g",
- "or"
- ],
- [
- "Ġsu",
- "pport"
- ],
- [
- "n",
- "e"
- ],
- [
- "Ġdiff",
- "ere"
- ],
- [
- "åħ",
- "ĥ"
- ],
- [
- "Ġas",
- "s"
- ],
- [
- "åĨ",
- "³"
- ],
- [
- "é",
- "Ľ"
- ],
- [
- "åIJ",
- "į"
- ],
- [
- "Ġg",
- "o"
- ],
- [
- "Ġtechnolo",
- "gy"
- ],
- [
- "æĢ",
- "»"
- ],
- [
- "è®",
- "®"
- ],
- [
- "Ġin",
- "ter"
- ],
- [
- "Ġin",
- "v"
- ],
- [
- "Ġo",
- "ur"
- ],
- [
- "æķ",
- "Ī"
- ],
- [
- "ust",
- "om"
- ],
- [
- "Ġre",
- "l"
- ],
- [
- "if",
- "e"
- ],
- [
- "åĻ",
- "¨"
- ],
- [
- "ing",
- "s"
- ],
- [
- "ä»",
- "·"
- ],
- [
- "Ġp",
- "art"
- ],
- [
- "è¢",
- "«"
- ],
- [
- "æī",
- "ĭ"
- ],
- [
- "ar",
- "y"
- ],
- [
- "Ġres",
- "pon"
- ],
- [
- "Ċ",
- "ĠĠĠ"
- ],
- [
- "好",
- "çļĦ"
- ],
- [
- "at",
- "ive"
- ],
- [
- "帮",
- "åĬ©"
- ],
- [
- "ç»",
- "Ł"
- ],
- [
- "æĶ",
- "¾"
- ],
- [
- "ĠH",
- "ere"
- ],
- [
- "ç",
- "ģ"
- ],
- [
- "Ġb",
- "ut"
- ],
- [
- "æģ",
- "¯"
- ],
- [
- "æŃ",
- "£"
- ],
- [
- "ar",
- "k"
- ],
- [
- "åħ¬",
- "åı¸"
- ],
- [
- "or",
- "y"
- ],
- [
- "å¢",
- "ĥ"
- ],
- [
- "le",
- "ct"
- ],
- [
- "é",
- "Ł"
- ],
- [
- "æĥ",
- "³"
- ],
- [
- "é£",
- "İ"
- ],
- [
- "at",
- "ing"
- ],
- [
- "Ġa",
- "m"
- ],
- [
- "it",
- "s"
- ],
- [
- "æ",
- "»"
- ],
- [
- "gor",
- "ith"
- ],
- [
- "åĵ",
- "į"
- ],
- [
- "ure",
- "s"
- ],
- [
- "Ġeff",
- "ect"
- ],
- [
- "Ġsh",
- "ould"
- ],
- [
- "Ġp",
- "er"
- ],
- [
- "è",
- "±"
- ],
- [
- "ç",
- "²"
- ],
- [
- "ic",
- "t"
- ],
- [
- "Ġal",
- "gorith"
- ],
- [
- "u",
- "c"
- ],
- [
- "rou",
- "gh"
- ],
- [
- "ä»",
- "»"
- ],
- [
- "ä»",
- "¶"
- ],
- [
- "Ġbe",
- "t"
- ],
- [
- "i",
- "a"
- ],
- [
- "Ġanaly",
- "z"
- ],
- [
- "æł¹",
- "æį®"
- ],
- [
- "iz",
- "ed"
- ],
- [
- "æµ",
- "ģ"
- ],
- [
- "è§",
- "Ĥ"
- ],
- [
- "è",
- "£"
- ],
- [
- "æł",
- "ĩ"
- ],
- [
- "ir",
- "on"
- ],
- [
- "Ġc",
- "ustom"
- ],
- [
- "Ġre",
- "g"
- ],
- [
- "Ġperson",
- "al"
- ],
- [
- "èĥ½",
- "å¤Ł"
- ],
- [
- "ic",
- "s"
- ],
- [
- "iv",
- "id"
- ],
- [
- "ç",
- "Ī"
- ],
- [
- "èµ",
- "Ħ"
- ],
- [
- "æŃ",
- "¥"
- ],
- [
- "å®",
- "¹"
- ],
- [
- "åĪ",
- "Ľ"
- ],
- [
- "è",
- "Ī"
- ],
- [
- "ä¹",
- "IJ"
- ],
- [
- "å¯",
- "¼"
- ],
- [
- "g",
- "an"
- ],
- [
- "èĬ",
- "Ĥ"
- ],
- [
- "Ġal",
- "l"
- ],
- [
- "en",
- "s"
- ],
- [
- "am",
- "e"
- ],
- [
- "n",
- "ess"
- ],
- [
- "Ġu",
- "p"
- ],
- [
- "Ġ",
- "U"
- ],
- [
- "èĢ",
- "ĥ"
- ],
- [
- "el",
- "f"
- ],
- [
- "åĢ",
- "¼"
- ],
- [
- "å°",
- "ij"
- ],
- [
- "æľ",
- "į"
- ],
- [
- "ar",
- "i"
- ],
- [
- "th",
- "ical"
- ],
- [
- "v",
- "iron"
- ],
- [
- "è",
- "ĥ"
- ],
- [
- "or",
- "d"
- ],
- [
- "Ġs",
- "ign"
- ],
- [
- "éĩ",
- "Į"
- ],
- [
- "ou",
- "nd"
- ],
- [
- "o",
- "ple"
- ],
- [
- "åŁ",
- "º"
- ],
- [
- "Ġinform",
- "ation"
- ],
- [
- "Ġident",
- "ify"
- ],
- [
- "åĽ",
- "ŀ"
- ],
- [
- "Ġc",
- "re"
- ],
- [
- "éŁ",
- "³"
- ],
- [
- "ib",
- "le"
- ],
- [
- "u",
- "b"
- ],
- [
- "è¿",
- "IJ"
- ],
- [
- "Ġle",
- "ad"
- ],
- [
- "æ¸",
- "¸"
- ],
- [
- "æ¬",
- "¡"
- ],
- [
- "åĨ",
- "Ļ"
- ],
- [
- "éĤ",
- "£"
- ],
- [
- "g",
- "et"
- ],
- [
- "è",
- "į"
- ],
- [
- "Ġexam",
- "ple"
- ],
- [
- "ä¼",
- "ĺ"
- ],
- [
- "å½±",
- "åĵį"
- ],
- [
- "is",
- "h"
- ],
- [
- "x",
- "t"
- ],
- [
- "æ",
- "º"
- ],
- [
- "éª",
- "Į"
- ],
- [
- "o",
- "b"
- ],
- [
- "å®",
- "¢"
- ],
- [
- "å¤",
- "ĩ"
- ],
- [
- "åģ",
- "¥"
- ],
- [
- "è½",
- "¦"
- ],
- [
- "ç¤",
- "¾"
- ],
- [
- "ivid",
- "ual"
- ],
- [
- "ere",
- "d"
- ],
- [
- "l",
- "es"
- ],
- [
- "Ġen",
- "viron"
- ],
- [
- "Ġpe",
- "ople"
- ],
- [
- "æĺ",
- "Ł"
- ],
- [
- "ç",
- "ĸ"
- ],
- [
- "ç",
- "ĭ"
- ],
- [
- "Ġd",
- "et"
- ],
- [
- "æĹ",
- "ł"
- ],
- [
- "Ġ",
- "if"
- ],
- [
- "o",
- "se"
- ],
- [
- "it",
- "e"
- ],
- [
- "å¢",
- "ŀ"
- ],
- [
- "é",
- "Ĵ"
- ],
- [
- "åIJĮ",
- "æĹ¶"
- ],
- [
- "è¿",
- "°"
- ],
- [
- "æĸ¹",
- "å¼ı"
- ],
- [
- "åĽ",
- "½"
- ],
- [
- "é",
- "»"
- ],
- [
- "å¤",
- "Ħ"
- ],
- [
- "Ġexam",
- "ples"
- ],
- [
- "æ",
- "®"
- ],
- [
- "Ġint",
- "o"
- ],
- [
- "æĮ",
- "ĩ"
- ],
- [
- "Ġhum",
- "an"
- ],
- [
- "åIJ",
- "ij"
- ],
- [
- "ç¤",
- "º"
- ],
- [
- "æķ°",
- "æį®"
- ],
- [
- "Ġ",
- "3"
- ],
- [
- "Ġ",
- "J"
- ],
- [
- "è",
- "ı"
- ],
- [
- "çݯ",
- "å¢ĥ"
- ],
- [
- "al",
- "s"
- ],
- [
- "ers",
- "t"
- ],
- [
- "Ġe",
- "thical"
- ],
- [
- "ç»",
- "Ħ"
- ],
- [
- "ä¼",
- "ł"
- ],
- [
- "Ġdiffere",
- "nt"
- ],
- [
- "Ġk",
- "now"
- ],
- [
- "åº",
- "ı"
- ],
- [
- "Ġind",
- "ividual"
- ],
- [
- "æıIJ",
- "é«ĺ"
- ],
- [
- "rou",
- "nd"
- ],
- [
- "å°",
- "±"
- ],
- [
- "åı",
- "ĸ"
- ],
- [
- "åŃ",
- "ĺ"
- ],
- [
- "ä¸",
- "¤"
- ],
- [
- "çŁ",
- "¥"
- ],
- [
- "our",
- "ces"
- ],
- [
- "c",
- "k"
- ],
- [
- "å",
- "£"
- ],
- [
- "in",
- "es"
- ],
- [
- "è¾",
- "¾"
- ],
- [
- "Ġman",
- "y"
- ],
- [
- "æķ",
- "´"
- ],
- [
- "æł",
- "·"
- ],
- [
- "dit",
- "ional"
- ],
- [
- "om",
- "m"
- ],
- [
- "çĶ",
- "±"
- ],
- [
- "éĢ",
- "ł"
- ],
- [
- "å®ĥ",
- "们"
- ],
- [
- "u",
- "es"
- ],
- [
- "Ġm",
- "ent"
- ],
- [
- "Ġimport",
- "ant"
- ],
- [
- "Ġo",
- "pt"
- ],
- [
- "Ġlo",
- "c"
- ],
- [
- "p",
- "h"
- ],
- [
- "Ġpro",
- "cess"
- ],
- [
- "Ġalgorith",
- "ms"
- ],
- [
- "设",
- "计"
- ],
- [
- "Ġsoc",
- "ial"
- ],
- [
- "ver",
- "y"
- ],
- [
- "åĪ",
- "Ļ"
- ],
- [
- "ä¾ĭ",
- "å¦Ĥ"
- ],
- [
- "è®",
- "¤"
- ],
- [
- "Ġa",
- "ut"
- ],
- [
- "Ġs",
- "erv"
- ],
- [
- "g",
- "g"
- ],
- [
- "产",
- "åĵģ"
- ],
- [
- "è§",
- "Ħ"
- ],
- [
- "çľ",
- "ĭ"
- ],
- [
- "ve",
- "l"
- ],
- [
- "æĸ¹",
- "æ³ķ"
- ],
- [
- "Ġb",
- "en"
- ],
- [
- "åĽł",
- "æŃ¤"
- ],
- [
- "c",
- "are"
- ],
- [
- "p",
- "er"
- ],
- [
- "åĬ",
- "Ł"
- ],
- [
- "建",
- "è®®"
- ],
- [
- "Ġp",
- "os"
- ],
- [
- "æ",
- "¤"
- ],
- [
- "w",
- "e"
- ],
- [
- "åĮ",
- "º"
- ],
- [
- "i",
- "qu"
- ],
- [
- "Ġre",
- "al"
- ],
- [
- "æĹ",
- "¥"
- ],
- [
- "Ġredu",
- "ce"
- ],
- [
- "a",
- "f"
- ],
- [
- "ang",
- "u"
- ],
- [
- "Ġs",
- "k"
- ],
- [
- "Ġ",
- "ed"
- ],
- [
- "erst",
- "and"
- ],
- [
- "åĨ",
- "µ"
- ],
- [
- "m",
- "ot"
- ],
- [
- "åħ",
- "Ī"
- ],
- [
- "ç",
- "¥"
- ],
- [
- "åºĶ",
- "该"
- ],
- [
- "Ġth",
- "rough"
- ],
- [
- "Ġcon",
- "c"
- ],
- [
- "åıij",
- "å±ķ"
- ],
- [
- "è¯",
- "ķ"
- ],
- [
- "æ¡",
- "Ī"
- ],
- [
- "Ġenviron",
- "ment"
- ],
- [
- "åı",
- "£"
- ],
- [
- "Ġad",
- "v"
- ],
- [
- "åĪ",
- "«"
- ],
- [
- "Ġben",
- "ef"
- ],
- [
- "æ¸",
- "ħ"
- ],
- [
- "åij",
- "³"
- ],
- [
- "åħ",
- "ī"
- ],
- [
- "Ġdevelop",
- "ment"
- ],
- [
- "en",
- "g"
- ],
- [
- "å¦Ĥ",
- "ä½ķ"
- ],
- [
- "ç®",
- "¡"
- ],
- [
- "iv",
- "ers"
- ],
- [
- "åIJ",
- "Ħ"
- ],
- [
- "Ġr",
- "is"
- ],
- [
- "ro",
- "w"
- ],
- [
- "er",
- "gy"
- ],
- [
- "计",
- "ç®Ĺ"
- ],
- [
- "ä¿¡",
- "æģ¯"
- ],
- [
- "Ġpro",
- "duct"
- ],
- [
- "è¾",
- "ĥ"
- ],
- [
- "è®",
- "º"
- ],
- [
- "èĩªå·±",
- "çļĦ"
- ],
- [
- "æĬ",
- "¤"
- ],
- [
- "åı",
- "į"
- ],
- [
- "åħ¶",
- "ä»ĸ"
- ],
- [
- "åĪ",
- "Ĺ"
- ],
- [
- "ç»",
- "Ĩ"
- ],
- [
- "ç©",
- "º"
- ],
- [
- "Ġg",
- "reat"
- ],
- [
- "e",
- "ar"
- ],
- [
- "æº",
- "IJ"
- ],
- [
- "j",
- "ect"
- ],
- [
- "çĶŁ",
- "æ´»"
- ],
- [
- "ä¸Ń",
- "çļĦ"
- ],
- [
- "Ġund",
- "erstand"
- ],
- [
- "è",
- "ĭ"
- ],
- [
- "h",
- "at"
- ],
- [
- "Ġpro",
- "gra"
- ],
- [
- "ç",
- "Ĭ"
- ],
- [
- "éĩ",
- "ij"
- ],
- [
- "Ġinclud",
- "ing"
- ],
- [
- "Ġacc",
- "ess"
- ],
- [
- "ĠĠĠĠ",
- "ĠĠĠ"
- ],
- [
- "è¯",
- "Ĩ"
- ],
- [
- "ç",
- "¦"
- ],
- [
- "o",
- "g"
- ],
- [
- "è£",
- "ħ"
- ],
- [
- "Ġar",
- "t"
- ],
- [
- "Ġw",
- "rit"
- ],
- [
- "Ġinc",
- "re"
- ],
- [
- "Ġp",
- "h"
- ],
- [
- "æĸ¹",
- "éĿ¢"
- ],
- [
- "Ġp",
- "ract"
- ],
- [
- "Ġus",
- "ing"
- ],
- [
- "é¡",
- "¹"
- ],
- [
- "æİ",
- "¥"
- ],
- [
- "Ġway",
- "s"
- ],
- [
- "Ġl",
- "angu"
- ],
- [
- "æĶ",
- "¯"
- ],
- [
- "Ġch",
- "all"
- ],
- [
- "åİ",
- "»"
- ],
- [
- "__",
- "__"
- ],
- [
- "im",
- "ate"
- ],
- [
- "æĸ",
- "Ń"
- ],
- [
- "è",
- "¨"
- ],
- [
- "Ġw",
- "ell"
- ],
- [
- "l",
- "l"
- ],
- [
- "Ġp",
- "ol"
- ],
- [
- "æĢ",
- "ģ"
- ],
- [
- "Ġ",
- "ra"
- ],
- [
- "C",
- "an"
- ],
- [
- "åİ",
- "Ł"
- ],
- [
- "b",
- "er"
- ],
- [
- "è¨",
- "Ģ"
- ],
- [
- "ç«",
- "ĭ"
- ],
- [
- "Ġg",
- "en"
- ],
- [
- "éħ",
- "į"
- ],
- [
- "æ·",
- "±"
- ],
- [
- "t",
- "e"
- ],
- [
- "ä¸",
- "ī"
- ],
- [
- "ç§",
- "ij"
- ],
- [
- "ĠF",
- "or"
- ],
- [
- "çº",
- "¿"
- ],
- [
- "ç",
- "ħ"
- ],
- [
- "æ",
- "¼"
- ],
- [
- "åķ",
- "Ĩ"
- ],
- [
- "æĿ",
- "IJ"
- ],
- [
- "Ġsign",
- "ific"
- ],
- [
- "Ġg",
- "u"
- ],
- [
- "Ġde",
- "cis"
- ],
- [
- "Ġtra",
- "in"
- ],
- [
- "Ġa",
- "g"
- ],
- [
- "Ġc",
- "reat"
- ],
- [
- "å®",
- "Į"
- ],
- [
- "æĹ¶",
- "éĹ´"
- ],
- [
- "Ġon",
- "e"
- ],
- [
- "è",
- "Ħ"
- ],
- [
- "Ġn",
- "at"
- ],
- [
- "åѦ",
- "ä¹ł"
- ],
- [
- "çļĦæ",
- "ķ"
- ],
- [
- "c",
- "ed"
- ],
- [
- "Ġwhe",
- "n"
- ],
- [
- "Ġb",
- "i"
- ],
- [
- "è",
- "İ"
- ],
- [
- "æĽ´",
- "åĬł"
- ],
- [
- "iv",
- "es"
- ],
- [
- "p",
- "ort"
- ],
- [
- "å·¥",
- "ä½ľ"
- ],
- [
- "v",
- "ing"
- ],
- [
- "Ġbe",
- "en"
- ],
- [
- "æĻ",
- "º"
- ],
- [
- "Ġl",
- "ife"
- ],
- [
- "å¼",
- "ķ"
- ],
- [
- "ar",
- "m"
- ],
- [
- "çİ",
- "ĩ"
- ],
- [
- "ç͍",
- "æĪ·"
- ],
- [
- "ä¹",
- "ī"
- ],
- [
- "ä»",
- "½"
- ],
- [
- "è¯",
- "Ŀ"
- ],
- [
- "in",
- "ess"
- ],
- [
- "c",
- "om"
- ],
- [
- "åº",
- "·"
- ],
- [
- "åĩ",
- "ı"
- ],
- [
- "ä»",
- "Ģ"
- ],
- [
- "è¾",
- "ĵ"
- ],
- [
- "Ġv",
- "ari"
- ],
- [
- "c",
- "on"
- ],
- [
- "Ġmo",
- "d"
- ],
- [
- "ä»Ģ",
- "ä¹Ī"
- ],
- [
- "Ġen",
- "ergy"
- ],
- [
- "æĬĢ",
- "æľ¯"
- ],
- [
- "ert",
- "ain"
- ],
- [
- "m",
- "m"
- ],
- [
- "ver",
- "all"
- ],
- [
- "åĪ",
- "Ĵ"
- ],
- [
- "Ġro",
- "bots"
- ],
- [
- "Ġor",
- "gan"
- ],
- [
- "æİ",
- "¨"
- ],
- [
- "ant",
- "s"
- ],
- [
- "åĩ",
- "Ĩ"
- ],
- [
- "d",
- "s"
- ],
- [
- "æŀ",
- "ģ"
- ],
- [
- "ç",
- "Ļ"
- ],
- [
- "Ġre",
- "qu"
- ],
- [
- "Ġ",
- "ess"
- ],
- [
- "ç®",
- "Ģ"
- ],
- [
- "ust",
- "ain"
- ],
- [
- "æ",
- "¨"
- ],
- [
- "Ġst",
- "r"
- ],
- [
- "c",
- "ing"
- ],
- [
- "ab",
- "ility"
- ],
- [
- "re",
- "e"
- ],
- [
- "Ġed",
- "uc"
- ],
- [
- "åİ",
- "Ĩ"
- ],
- [
- "Ġcre",
- "ate"
- ],
- [
- "åģ¥",
- "康"
- ],
- [
- "Ġdes",
- "ign"
- ],
- [
- "i",
- "ps"
- ],
- [
- "åģ",
- "ļ"
- ],
- [
- "èĬ",
- "±"
- ],
- [
- "in",
- "k"
- ],
- [
- "èı",
- "ľ"
- ],
- [
- "æī",
- "¾"
- ],
- [
- "æ®",
- "µ"
- ],
- [
- "æµ",
- "ĭ"
- ],
- [
- "Ġ",
- "V"
- ],
- [
- "ĠB",
- "y"
- ],
- [
- "å",
- "Ķ"
- ],
- [
- "é¦",
- "ĸ"
- ],
- [
- "è¯",
- "į"
- ],
- [
- "Ġwhe",
- "re"
- ],
- [
- "Ġdis",
- "c"
- ],
- [
- "äºĨ",
- "è§£"
- ],
- [
- "r",
- "ic"
- ],
- [
- "ä¸",
- "Ķ"
- ],
- [
- "è¶",
- "³"
- ],
- [
- "æĺ¯",
- "ä¸Ģ个"
- ],
- [
- "ar",
- "ch"
- ],
- [
- "ç§",
- "¯"
- ],
- [
- "å¸",
- "¦"
- ],
- [
- "Ġwh",
- "ile"
- ],
- [
- "Ġsignific",
- "ant"
- ],
- [
- "çł",
- "ģ"
- ],
- [
- "æĪ",
- "¿"
- ],
- [
- "Ġbe",
- "ing"
- ],
- [
- "Ġlangu",
- "age"
- ],
- [
- "it",
- "ive"
- ],
- [
- "2",
- "0"
- ],
- [
- "Ġanalyz",
- "e"
- ],
- [
- "æĻ",
- "¯"
- ],
- [
- "è",
- "Į"
- ],
- [
- "ri",
- "b"
- ],
- [
- "æ¨",
- "¡"
- ],
- [
- "ĠS",
- "t"
- ],
- [
- "è´",
- "¹"
- ],
- [
- "'",
- "t"
- ],
- [
- "Ġhealth",
- "care"
- ],
- [
- "Ġexperi",
- "ence"
- ],
- [
- "Ġ",
- "5"
- ],
- [
- "个",
- "人"
- ],
- [
- "ay",
- "s"
- ],
- [
- "è±",
- "¡"
- ],
- [
- "p",
- "lo"
- ],
- [
- "Ġw",
- "ould"
- ],
- [
- "èĻ",
- "ij"
- ],
- [
- "æĶ",
- "¶"
- ],
- [
- "é¢",
- "Ħ"
- ],
- [
- "é¢",
- "Ĩ"
- ],
- [
- "ä¿Ŀ",
- "æĮģ"
- ],
- [
- "en",
- "ces"
- ],
- [
- "åı",
- "ª"
- ],
- [
- "èĩ",
- "´"
- ],
- [
- "æĪ",
- "ı"
- ],
- [
- "Ġment",
- "al"
- ],
- [
- "Ġfe",
- "w"
- ],
- [
- "at",
- "es"
- ],
- [
- "è¿ĩ",
- "ç¨ĭ"
- ],
- [
- "å®ī",
- "åħ¨"
- ],
- [
- "Ġs",
- "ustain"
- ],
- [
- "Ġw",
- "ere"
- ],
- [
- "å¤",
- "ª"
- ],
- [
- "ç",
- "Į"
- ],
- [
- "Ġspec",
- "ific"
- ],
- [
- "Ġwor",
- "ld"
- ],
- [
- "çŃ",
- "Ķ"
- ],
- [
- "``",
- "`"
- ],
- [
- "Ġt",
- "ake"
- ],
- [
- "åħ",
- "»"
- ],
- [
- "éĢ",
- "Ł"
- ],
- [
- "e",
- "ver"
- ],
- [
- "S",
- "S"
- ],
- [
- "éĶ",
- "Ģ"
- ],
- [
- "Ġb",
- "o"
- ],
- [
- "he",
- "s"
- ],
- [
- "Ġm",
- "us"
- ],
- [
- "æľį",
- "åĬ¡"
- ],
- [
- "è§",
- "Ĵ"
- ],
- [
- "t",
- "en"
- ],
- [
- "æŀ",
- "IJ"
- ],
- [
- "p",
- "ow"
- ],
- [
- "d",
- "ict"
- ],
- [
- "v",
- "ent"
- ],
- [
- "1",
- "0"
- ],
- [
- "çļĦæ",
- "Ĺ"
- ],
- [
- "ĸ",
- "çķ"
- ],
- [
- "Ġpro",
- "t"
- ],
- [
- "ç½",
- "®"
- ],
- [
- "Ġh",
- "igh"
- ],
- [
- "Ġb",
- "us"
- ],
- [
- "Ġind",
- "ust"
- ],
- [
- "åIJ",
- "¦"
- ],
- [
- "c",
- "ial"
- ],
- [
- "人",
- "们"
- ],
- [
- "ĠA",
- "s"
- ],
- [
- "åij",
- "Ĭ"
- ],
- [
- "ad",
- "e"
- ],
- [
- "æĶ",
- "¹"
- ],
- [
- "ç",
- "Ĺ"
- ],
- [
- "Ġh",
- "ad"
- ],
- [
- "Ġhe",
- "r"
- ],
- [
- "Ġj",
- "ust"
- ],
- [
- "ï¼",
- "Ľ"
- ],
- [
- "è´",
- "Ń"
- ],
- [
- "ç¬",
- "¬"
- ],
- [
- "é",
- "ĵ"
- ],
- [
- "Ġw",
- "ater"
- ],
- [
- "Ġf",
- "ood"
- ],
- [
- "éĺ",
- "Ł"
- ],
- [
- "a",
- "us"
- ],
- [
- "Ġchall",
- "eng"
- ],
- [
- "åħ",
- "į"
- ],
- [
- "æĸĩ",
- "åĮĸ"
- ],
- [
- "Ġmo",
- "st"
- ],
- [
- "é",
- "¸"
- ],
- [
- "ç½",
- "ij"
- ],
- [
- "çĽ",
- "´"
- ],
- [
- "Ġs",
- "m"
- ],
- [
- "Ġact",
- "iv"
- ],
- [
- "plo",
- "y"
- ],
- [
- "O",
- "verall"
- ],
- [
- "å¿",
- "«"
- ],
- [
- "ru",
- "ct"
- ],
- [
- "Ġindividual",
- "s"
- ],
- [
- "å§",
- "ĭ"
- ],
- [
- "g",
- "ies"
- ],
- [
- "æŁ",
- "¥"
- ],
- [
- "çĪ",
- "±"
- ],
- [
- "i",
- "ety"
- ],
- [
- "I",
- "n"
- ],
- [
- "åĪĨ",
- "æŀIJ"
- ],
- [
- "è§",
- "Ĩ"
- ],
- [
- "æ¸",
- "©"
- ],
- [
- "ç»",
- "´"
- ],
- [
- "ol",
- "ut"
- ],
- [
- "åŁ",
- "Ł"
- ],
- [
- "omm",
- "end"
- ],
- [
- "Ġcom",
- "ple"
- ],
- [
- "æķ",
- "Ļ"
- ],
- [
- "Ġb",
- "u"
- ],
- [
- "Ġeduc",
- "ation"
- ],
- [
- "at",
- "her"
- ],
- [
- "Ġ",
- "4"
- ],
- [
- "t",
- "ing"
- ],
- [
- "Ġf",
- "ind"
- ],
- [
- "æ²",
- "¡"
- ],
- [
- "Ġh",
- "is"
- ],
- [
- "ä¹ĭ",
- "éĹ´"
- ],
- [
- "Ġeffect",
- "ive"
- ],
- [
- "Ġat",
- "t"
- ],
- [
- "Ġre",
- "se"
- ],
- [
- "èĥ½",
- "åĬĽ"
- ],
- [
- "åŁ",
- "İ"
- ],
- [
- "Ġal",
- "low"
- ],
- [
- "Ġa",
- "v"
- ],
- [
- "Ġpro",
- "mot"
- ],
- [
- "æĻº",
- "èĥ½"
- ],
- [
- "æ»",
- "¡"
- ],
- [
- "åħ",
- "±"
- ],
- [
- "ie",
- "w"
- ],
- [
- "c",
- "ome"
- ],
- [
- "ç³»",
- "绣"
- ],
- [
- "Ġrespon",
- "s"
- ],
- [
- "äº",
- "Ĵ"
- ],
- [
- "Ġc",
- "ult"
- ],
- [
- "pow",
- "ered"
- ],
- [
- "Ġrec",
- "ommend"
- ],
- [
- "èIJ",
- "¥"
- ],
- [
- "O",
- "SS"
- ],
- [
- "Ġch",
- "ange"
- ],
- [
- "è¯",
- "ģ"
- ],
- [
- "v",
- "ed"
- ],
- [
- "æİ",
- "Ĵ"
- ],
- [
- "è§£",
- "åĨ³"
- ],
- [
- "ic",
- "i"
- ],
- [
- "ĠH",
- "ow"
- ],
- [
- "Ġfe",
- "el"
- ],
- [
- "æľ",
- "Ī"
- ],
- [
- "Ġwh",
- "at"
- ],
- [
- "以",
- "åıĬ"
- ],
- [
- "Ġse",
- "e"
- ],
- [
- "åŃ",
- "©"
- ],
- [
- "b",
- "s"
- ],
- [
- "Ġs",
- "ur"
- ],
- [
- "æ",
- "£"
- ],
- [
- "al",
- "ity"
- ],
- [
- "Ġv",
- "is"
- ],
- [
- "ç¡®",
- "ä¿Ŀ"
- ],
- [
- "p",
- "ect"
- ],
- [
- "å®ŀ",
- "çݰ"
- ],
- [
- "Ġc",
- "are"
- ],
- [
- "å¹",
- "¿"
- ],
- [
- "ill",
- "s"
- ],
- [
- "åº",
- "Ń"
- ],
- [
- "as",
- "es"
- ],
- [
- "å¤",
- "į"
- ],
- [
- "åºĶ",
- "ç͍"
- ],
- [
- "çļĦæ",
- "ĥ"
- ],
- [
- "ard",
- "s"
- ],
- [
- "Ġadd",
- "ress"
- ],
- [
- "Ġcomp",
- "an"
- ],
- [
- "Ġinv",
- "ol"
- ],
- [
- "Ġcustom",
- "er"
- ],
- [
- "åĽł",
- "为"
- ],
- [
- "Ġstud",
- "ents"
- ],
- [
- "Ġin",
- "s"
- ],
- [
- "注",
- "æĦı"
- ],
- [
- "æŀ",
- "Ħ"
- ],
- [
- "æ¬",
- "¢"
- ],
- [
- "æµ",
- "·"
- ],
- [
- "åı",
- "Ĥ"
- ],
- [
- "èĩª",
- "çĦ¶"
- ],
- [
- "é",
- "©"
- ],
- [
- "ĠThe",
- "se"
- ],
- [
- "w",
- "n"
- ],
- [
- "æĺ",
- "ĵ"
- ],
- [
- "çĬ",
- "¶"
- ],
- [
- "re",
- "n"
- ],
- [
- "Ġt",
- "reat"
- ],
- [
- "Ġbenef",
- "its"
- ],
- [
- "Ċ",
- "ĠĠĠĠĠĠĠ"
- ],
- [
- "对",
- "äºİ"
- ],
- [
- "æĢ",
- "Ŀ"
- ],
- [
- "id",
- "er"
- ],
- [
- "ĠY",
- "es"
- ],
- [
- "Ġ",
- "K"
- ],
- [
- "åĸ",
- "ľ"
- ],
- [
- "Ġ",
- "ke"
- ],
- [
- "Ġen",
- "g"
- ],
- [
- "Ġpo",
- "p"
- ],
- [
- "o",
- "st"
- ],
- [
- "p",
- "are"
- ],
- [
- "Ġm",
- "on"
- ],
- [
- "æ¬",
- "¾"
- ],
- [
- "ĠM",
- "OSS"
- ],
- [
- "Ġem",
- "ot"
- ],
- [
- "Ġa",
- "c"
- ],
- [
- "ç¼",
- "ĸ"
- ],
- [
- "f",
- "ore"
- ],
- [
- "åı",
- "¥"
- ],
- [
- "Ġv",
- "al"
- ],
- [
- "il",
- "y"
- ],
- [
- "Ġis",
- "s"
- ],
- [
- "èĤ",
- "ī"
- ],
- [
- "èĩ",
- "³"
- ],
- [
- "游",
- "æĪı"
- ],
- [
- "we",
- "en"
- ],
- [
- "Ġinclud",
- "e"
- ],
- [
- "Ġprot",
- "ect"
- ],
- [
- "åħ³",
- "ç³»"
- ],
- [
- "éĻ",
- "©"
- ],
- [
- "Ġse",
- "ver"
- ],
- [
- "Ġth",
- "an"
- ],
- [
- "éľĢ",
- "æ±Ĥ"
- ],
- [
- "ç»",
- "ĥ"
- ],
- [
- "ĠThe",
- "y"
- ],
- [
- "is",
- "s"
- ],
- [
- "y",
- "s"
- ],
- [
- "Ġj",
- "ob"
- ],
- [
- "éĺ",
- "³"
- ],
- [
- "æ",
- "IJ"
- ],
- [
- "Ġbet",
- "ween"
- ],
- [
- "Ġm",
- "ach"
- ],
- [
- "----",
- "----"
- ],
- [
- "èĢĥ",
- "èĻij"
- ],
- [
- "è´¨",
- "éĩı"
- ],
- [
- "Ġbus",
- "iness"
- ],
- [
- "w",
- "or"
- ],
- [
- "ic",
- "k"
- ],
- [
- "e",
- "g"
- ],
- [
- "åħ",
- "ħ"
- ],
- [
- "ç",
- "¯"
- ],
- [
- "æĿ",
- "¡"
- ],
- [
- "n",
- "er"
- ],
- [
- "a",
- "pt"
- ],
- [
- "Ġapp",
- "ro"
- ],
- [
- "Ġpl",
- "ay"
- ],
- [
- "没",
- "æľī"
- ],
- [
- "¤",
- "IJ"
- ],
- [
- "æľ",
- "ª"
- ],
- [
- "æĪ",
- "ĺ"
- ],
- [
- "å®¶",
- "åºŃ"
- ],
- [
- "ãĢ",
- "ĭ"
- ],
- [
- "en",
- "cy"
- ],
- [
- "ĠC",
- "h"
- ],
- [
- "ãĢ",
- "Ĭ"
- ],
- [
- "Ġprovid",
- "ing"
- ],
- [
- "Ġres",
- "ources"
- ],
- [
- "âĢ",
- "Ļ"
- ],
- [
- "Ġass",
- "ist"
- ],
- [
- "Ġnat",
- "ural"
- ],
- [
- "è¯",
- "Ħ"
- ],
- [
- "ä¾",
- "¿"
- ],
- [
- "Ġs",
- "af"
- ],
- [
- "åħ·",
- "æľī"
- ],
- [
- "è°",
- "¢"
- ],
- [
- "çĥ",
- "Ń"
- ],
- [
- "s",
- "s"
- ],
- [
- "et",
- "h"
- ],
- [
- "ol",
- "d"
- ],
- [
- "Ġper",
- "form"
- ],
- [
- "Ġsever",
- "al"
- ],
- [
- "é",
- "¤IJ"
- ],
- [
- "Ġe",
- "ach"
- ],
- [
- "è½",
- "¬"
- ],
- [
- "c",
- "i"
- ],
- [
- "Ġt",
- "y"
- ],
- [
- "Ġp",
- "ub"
- ],
- [
- "æ´»",
- "åĬ¨"
- ],
- [
- "oc",
- "us"
- ],
- [
- "çī",
- "Į"
- ],
- [
- "è¶",
- "Ĭ"
- ],
- [
- "åĽ",
- "¢"
- ],
- [
- "è½",
- "»"
- ],
- [
- "è¯Ń",
- "è¨Ģ"
- ],
- [
- "Ġare",
- "as"
- ],
- [
- "éĩ",
- "ĩ"
- ],
- [
- "f",
- "t"
- ],
- [
- "ri",
- "end"
- ],
- [
- "å·",
- "²"
- ],
- [
- "å¸Ĥ",
- "åľº"
- ],
- [
- "it",
- "ion"
- ],
- [
- "i",
- "ents"
- ],
- [
- "管",
- "çIJĨ"
- ],
- [
- "è®",
- "¸"
- ],
- [
- "人",
- "ç±»"
- ],
- [
- "身",
- "ä½ĵ"
- ],
- [
- "iqu",
- "e"
- ],
- [
- "Ġpart",
- "ic"
- ],
- [
- "ç»",
- "Ń"
- ],
- [
- "age",
- "ment"
- ],
- [
- "v",
- "es"
- ],
- [
- "ç¬",
- "¦"
- ],
- [
- "l",
- "ine"
- ],
- [
- "çº",
- "¢"
- ],
- [
- "åIJ",
- "¸"
- ],
- [
- "Ġpat",
- "ter"
- ],
- [
- "00",
- "0"
- ],
- [
- "社",
- "ä¼ļ"
- ],
- [
- "åĨħ",
- "容"
- ],
- [
- "Ġorgan",
- "iz"
- ],
- [
- "ou",
- "gh"
- ],
- [
- "Ġ",
- "ve"
- ],
- [
- "åŃ©",
- "åŃIJ"
- ],
- [
- "æĸ",
- "½"
- ],
- [
- "æ¤",
- "į"
- ],
- [
- "åĩ",
- "ł"
- ],
- [
- "ä½Ĩ",
- "æĺ¯"
- ],
- [
- "Ġa",
- "ff"
- ],
- [
- "Ġn",
- "um"
- ],
- [
- "le",
- "ment"
- ],
- [
- "èī",
- "º"
- ],
- [
- "è",
- "ij"
- ],
- [
- "Ġc",
- "ar"
- ],
- [
- "ag",
- "es"
- ],
- [
- "ab",
- "or"
- ],
- [
- "æĺ¯ä¸Ģ",
- "ç§į"
- ],
- [
- "Ġin",
- "st"
- ],
- [
- "è",
- "Ľ"
- ],
- [
- "ä¹ĭ",
- "ä¸Ģ"
- ],
- [
- "è·",
- "¯"
- ],
- [
- "åį",
- "³"
- ],
- [
- "Ġm",
- "ain"
- ],
- [
- "éļ",
- "ı"
- ],
- [
- "H",
- "ow"
- ],
- [
- "å¿",
- "ħ"
- ],
- [
- "ç¨ĭ",
- "åºı"
- ],
- [
- "éŁ³",
- "ä¹IJ"
- ],
- [
- "re",
- "d"
- ],
- [
- "æ²",
- "¹"
- ],
- [
- "Ġoff",
- "er"
- ],
- [
- "et",
- "s"
- ],
- [
- "ç",
- "¢"
- ],
- [
- "Ġd",
- "uring"
- ],
- [
- "çļĦ",
- "人"
- ],
- [
- "æĽ´",
- "å¤ļ"
- ],
- [
- "Ġd",
- "i"
- ],
- [
- "代",
- "çłģ"
- ],
- [
- "èİ",
- "·"
- ],
- [
- "åħ",
- "ĭ"
- ],
- [
- "Ġgu",
- "id"
- ],
- [
- "主",
- "è¦ģ"
- ],
- [
- "Ġf",
- "am"
- ],
- [
- "æİ",
- "§"
- ],
- [
- "éĢļ",
- "常"
- ],
- [
- "ĠA",
- "d"
- ],
- [
- "å¤Ħ",
- "çIJĨ"
- ],
- [
- "ur",
- "n"
- ],
- [
- "ow",
- "er"
- ],
- [
- "åij",
- "½"
- ],
- [
- "æı",
- "ı"
- ],
- [
- "Ġsk",
- "ills"
- ],
- [
- "Ġto",
- "ol"
- ],
- [
- "w",
- "are"
- ],
- [
- "æĸĩ",
- "æľ¬"
- ],
- [
- "Ġpatter",
- "ns"
- ],
- [
- "缮",
- "æłĩ"
- ],
- [
- "ac",
- "y"
- ],
- [
- "æī",
- "ĵ"
- ],
- [
- "åŁİ",
- "å¸Ĥ"
- ],
- [
- "Ġe",
- "very"
- ],
- [
- "r",
- "ies"
- ],
- [
- "è¯",
- "»"
- ],
- [
- "éģ",
- "¿"
- ],
- [
- "çĻ",
- "½"
- ],
- [
- "éĢĤ",
- "åIJĪ"
- ],
- [
- "Ġpat",
- "ient"
- ],
- [
- "çľ",
- "Ł"
- ],
- [
- "ot",
- "h"
- ],
- [
- "å¥",
- "¹"
- ],
- [
- "åĶ",
- "®"
- ],
- [
- "ä¸Ģ",
- "ç§į"
- ],
- [
- "Ġm",
- "ade"
- ],
- [
- "ä½",
- "İ"
- ],
- [
- "is",
- "e"
- ],
- [
- "Ġre",
- "m"
- ],
- [
- "æ¶",
- "Ī"
- ],
- [
- "åIJ",
- "«"
- ],
- [
- "a",
- "ir"
- ],
- [
- "Ġgen",
- "er"
- ],
- [
- "o",
- "y"
- ],
- [
- "ç²",
- "¾"
- ],
- [
- "æĥħ",
- "åĨµ"
- ],
- [
- "ight",
- "s"
- ],
- [
- "Ġexp",
- "l"
- ],
- [
- "è§",
- "ģ"
- ],
- [
- "Ġpre",
- "dict"
- ],
- [
- "ç±",
- "³"
- ],
- [
- "æĽ´",
- "好"
- ],
- [
- "ä¿",
- "®"
- ],
- [
- "Ġcl",
- "imate"
- ],
- [
- "Ġf",
- "ocus"
- ],
- [
- "Ġg",
- "row"
- ],
- [
- "客",
- "æĪ·"
- ],
- [
- "ä¸į",
- "æĸŃ"
- ],
- [
- "it",
- "or"
- ],
- [
- "ĠE",
- "n"
- ],
- [
- "çº",
- "¦"
- ],
- [
- "æĺ¯",
- "åIJ¦"
- ],
- [
- "ä»",
- "ħ"
- ],
- [
- "æĪij们",
- "çļĦ"
- ],
- [
- "æľ",
- "Ľ"
- ],
- [
- "o",
- "p"
- ],
- [
- "Ġm",
- "aking"
- ],
- [
- "y",
- "th"
- ],
- [
- "cc",
- "ess"
- ],
- [
- "Ġo",
- "wn"
- ],
- [
- "gg",
- "est"
- ],
- [
- "Ġt",
- "as"
- ],
- [
- "ut",
- "ure"
- ],
- [
- "Ġmod",
- "el"
- ],
- [
- "p",
- "ut"
- ],
- [
- "Ġrese",
- "arch"
- ],
- [
- "ere",
- "st"
- ],
- [
- "éļ",
- "¾"
- ],
- [
- "Ġ",
- "["
- ],
- [
- "i",
- "el"
- ],
- [
- "ation",
- "al"
- ],
- [
- "Ġcommun",
- "ic"
- ],
- [
- "ç¥",
- "ŀ"
- ],
- [
- "ç©",
- "¶"
- ],
- [
- "Ġre",
- "st"
- ],
- [
- "æĪIJ",
- "为"
- ],
- [
- "k",
- "ing"
- ],
- [
- "p",
- "r"
- ],
- [
- "åĮ",
- "»"
- ],
- [
- "c",
- "ur"
- ],
- [
- "èĤ",
- "²"
- ],
- [
- "Ġ",
- "'"
- ],
- [
- "è¿Ļ",
- "ç§į"
- ],
- [
- "ç¯",
- "ĩ"
- ],
- [
- "Ġc",
- "he"
- ],
- [
- "ow",
- "n"
- ],
- [
- "éĻ",
- "ħ"
- ],
- [
- "Ġf",
- "in"
- ],
- [
- "åζ",
- "ä½ľ"
- ],
- [
- "Ġsu",
- "ggest"
- ],
- [
- "å¢ŀ",
- "åĬł"
- ],
- [
- "Ġmed",
- "ia"
- ],
- [
- "rib",
- "ut"
- ],
- [
- "çļĦæĥ",
- "ħ"
- ],
- [
- "åĬł",
- "åħ¥"
- ],
- [
- "Ġc",
- "le"
- ],
- [
- "åij",
- "¨"
- ],
- [
- "ç«",
- "ł"
- ],
- [
- "Ġth",
- "ink"
- ],
- [
- "Ġloc",
- "al"
- ],
- [
- "pport",
- "un"
- ],
- [
- "ĠY",
- "ou"
- ],
- [
- "Ġpl",
- "an"
- ],
- [
- "Ġev",
- "en"
- ],
- [
- "éĽ",
- "Ĩ"
- ],
- [
- "å·",
- "§"
- ],
- [
- "a",
- "x"
- ],
- [
- "Ġchalleng",
- "es"
- ],
- [
- "Ġpro",
- "f"
- ],
- [
- "ĠC",
- "an"
- ],
- [
- "Ġconc",
- "er"
- ],
- [
- "Ġf",
- "uture"
- ],
- [
- "åĬ",
- "¿"
- ],
- [
- "Ġre",
- "f"
- ],
- [
- "èģ",
- "Ķ"
- ],
- [
- "Ġs",
- "elf"
- ],
- [
- "æĪĸ",
- "èĢħ"
- ],
- [
- "b",
- "le"
- ],
- [
- "åĽ",
- "´"
- ],
- [
- "è¿IJ",
- "åĬ¨"
- ],
- [
- "Ġin",
- "f"
- ],
- [
- "éĩ",
- "Ĭ"
- ],
- [
- "Ġsustain",
- "able"
- ],
- [
- "Ġte",
- "xt"
- ],
- [
- "Ġg",
- "ra"
- ],
- [
- "äº",
- "Į"
- ],
- [
- "åĵģ",
- "çīĮ"
- ],
- [
- "ä¸įåIJĮ",
- "çļĦ"
- ],
- [
- "l",
- "ed"
- ],
- [
- "çĭ",
- "¬"
- ],
- [
- "Ġo",
- "pportun"
- ],
- [
- "Ġcont",
- "in"
- ],
- [
- "y",
- "m"
- ],
- [
- "Ġg",
- "et"
- ],
- [
- "å¯",
- "Ĩ"
- ],
- [
- "éĻ",
- "¤"
- ],
- [
- "æ",
- "ħ"
- ],
- [
- "éģ¿",
- "åħį"
- ],
- [
- "Ġ",
- "+"
- ],
- [
- "è§",
- "ī"
- ],
- [
- "Ġre",
- "t"
- ],
- [
- "å¸",
- "ĥ"
- ],
- [
- "Ġint",
- "erest"
- ],
- [
- "Ġsoc",
- "iety"
- ],
- [
- "ç»ĵ",
- "æŀľ"
- ],
- [
- "åIJ",
- "¬"
- ],
- [
- "é¦ĸ",
- "åħĪ"
- ],
- [
- "Ġb",
- "re"
- ],
- [
- "Ġ2",
- "0"
- ],
- [
- "ĠHow",
- "ever"
- ],
- [
- "è®",
- "°"
- ],
- [
- "on",
- "s"
- ],
- [
- "è¿",
- "ij"
- ],
- [
- "å¼Ģ",
- "å§ĭ"
- ],
- [
- "Ġbu",
- "ild"
- ],
- [
- "Ġbe",
- "h"
- ],
- [
- "'",
- "m"
- ],
- [
- "v",
- "ers"
- ],
- [
- "Ġg",
- "ood"
- ],
- [
- "çIJĨ",
- "è§£"
- ],
- [
- "res",
- "ent"
- ],
- [
- "ç¦",
- "»"
- ],
- [
- "åĬŁ",
- "èĥ½"
- ],
- [
- "Ġeff",
- "ort"
- ],
- [
- "l",
- "abor"
- ],
- [
- "é»",
- "ij"
- ],
- [
- "Ġbet",
- "ter"
- ],
- [
- "Ġre",
- "ad"
- ],
- [
- "å¾",
- "ĭ"
- ],
- [
- "èĽ",
- "ĭ"
- ],
- [
- "he",
- "d"
- ],
- [
- "ä¹",
- "°"
- ],
- [
- "导",
- "èĩ´"
- ],
- [
- "Ġimp",
- "lement"
- ],
- [
- "ç",
- "¿"
- ],
- [
- "äº",
- "«"
- ],
- [
- "å¤",
- "´"
- ],
- [
- "en",
- "se"
- ],
- [
- "Ġl",
- "ong"
- ],
- [
- "ot",
- "her"
- ],
- [
- "é¥",
- "®"
- ],
- [
- "åŃĺ",
- "åľ¨"
- ],
- [
- "çļĦæ",
- "Ħ"
- ],
- [
- "ä¸Ģ",
- "份"
- ],
- [
- "yth",
- "on"
- ],
- [
- "n",
- "ing"
- ],
- [
- "åĩı",
- "å°ij"
- ],
- [
- "åĢ",
- "Ļ"
- ],
- [
- "ä¸",
- "ĵ"
- ],
- [
- "åIJĦ",
- "ç§į"
- ],
- [
- "è",
- "ħ"
- ],
- [
- "å°",
- "½"
- ],
- [
- "åį",
- "ĩ"
- ],
- [
- "æĬ",
- "¥"
- ],
- [
- "Ġpub",
- "lic"
- ],
- [
- "Ġl",
- "ar"
- ],
- [
- "ä½ł",
- "çļĦ"
- ],
- [
- "a",
- "ut"
- ],
- [
- "é¢Ĩ",
- "åŁŁ"
- ],
- [
- "æ",
- "ļ"
- ],
- [
- "ol",
- "low"
- ],
- [
- "èģ",
- "Į"
- ],
- [
- "Ġch",
- "ang"
- ],
- [
- "Ġb",
- "est"
- ],
- [
- "h",
- "ip"
- ],
- [
- "åĨ",
- "į"
- ],
- [
- "ak",
- "es"
- ],
- [
- "Ġch",
- "at"
- ],
- [
- "it",
- "ed"
- ],
- [
- "Ġp",
- "ower"
- ],
- [
- "ä¿Ŀ",
- "æĬ¤"
- ],
- [
- "ä¹",
- "¦"
- ],
- [
- "计",
- "åĪĴ"
- ],
- [
- "éĩįè¦ģ",
- "çļĦ"
- ],
- [
- "åıĺ",
- "åĮĸ"
- ],
- [
- "il",
- "ities"
- ],
- [
- "Ġcons",
- "ider"
- ],
- [
- "æĪij们",
- "åı¯ä»¥"
- ],
- [
- "éĤ£",
- "ä¹Ī"
- ],
- [
- "Ġ",
- "ide"
- ],
- [
- "æ¼",
- "Ķ"
- ],
- [
- "ag",
- "ing"
- ],
- [
- "Ġb",
- "ased"
- ],
- [
- "å®",
- "Ŀ"
- ],
- [
- "Ġr",
- "ange"
- ],
- [
- "Ġres",
- "ult"
- ],
- [
- "Ġm",
- "em"
- ],
- [
- "çħ",
- "§"
- ],
- [
- "Ġle",
- "vel"
- ],
- [
- "c",
- "ou"
- ],
- [
- "Ġb",
- "r"
- ],
- [
- "T",
- "h"
- ],
- [
- "ä¼",
- "ģ"
- ],
- [
- "建",
- "ç«ĭ"
- ],
- [
- "Ġun",
- "ique"
- ],
- [
- "è®",
- "Ń"
- ],
- [
- "Ġm",
- "ark"
- ],
- [
- "许",
- "å¤ļ"
- ],
- [
- "è¡Į",
- "为"
- ],
- [
- "Ķ",
- "ç©¶"
- ],
- [
- "çļĦæ",
- "Ĭ"
- ],
- [
- "Ġs",
- "et"
- ],
- [
- "éª",
- "¤"
- ],
- [
- "t",
- "s"
- ],
- [
- "Ġh",
- "ist"
- ],
- [
- "Ġa",
- "round"
- ],
- [
- "Ġre",
- "v"
- ],
- [
- "åħ¶",
- "ä¸Ń"
- ],
- [
- "ï¼",
- "ģ"
- ],
- [
- "æıı",
- "è¿°"
- ],
- [
- "æľĢ",
- "åIJİ"
- ],
- [
- "Ġs",
- "im"
- ],
- [
- "n",
- "ect"
- ],
- [
- "åĽŀ",
- "çŃĶ"
- ],
- [
- "éĺ",
- "²"
- ],
- [
- "èī",
- "¯"
- ],
- [
- "åΰ",
- "äºĨ"
- ],
- [
- "ä¸",
- "ĸçķ"
- ],
- [
- "æĸ¹",
- "æ¡Ī"
- ],
- [
- "æĿIJ",
- "æĸĻ"
- ],
- [
- "ä¸ĸçķ",
- "Į"
- ],
- [
- "æĽ´å¥½",
- "åľ°"
- ],
- [
- "两",
- "个"
- ],
- [
- "Ġem",
- "ploy"
- ],
- [
- "Ġtr",
- "y"
- ],
- [
- "æ",
- "ĵ"
- ],
- [
- "Ġb",
- "ack"
- ],
- [
- "åĪ",
- "ĩ"
- ],
- [
- "Ġsu",
- "ccess"
- ],
- [
- "Ġdecis",
- "ions"
- ],
- [
- "Ġth",
- "ose"
- ],
- [
- "å¯",
- "Į"
- ],
- [
- "Ġf",
- "act"
- ],
- [
- "æİ",
- "¢"
- ],
- [
- "è¶",
- "£"
- ],
- [
- "Ġpract",
- "ices"
- ],
- [
- "åIJ",
- "Ĺ"
- ],
- [
- "æī",
- "į"
- ],
- [
- "çİ",
- "©"
- ],
- [
- "pt",
- "ion"
- ],
- [
- "æĸĩ",
- "竳"
- ],
- [
- "Ġfe",
- "at"
- ],
- [
- "Ġpre",
- "vent"
- ],
- [
- "Ġwrit",
- "ing"
- ],
- [
- "çļĦæ",
- "Ģ"
- ],
- [
- "Ġn",
- "o"
- ],
- [
- "ä»",
- "ĭ"
- ],
- [
- "éĹ",
- "¨"
- ],
- [
- "Ġd",
- "el"
- ],
- [
- "æ",
- "Ĵ"
- ],
- [
- "Ġopt",
- "im"
- ],
- [
- "in",
- "ation"
- ],
- [
- "Ġ",
- "Ċ"
- ],
- [
- "us",
- "ion"
- ],
- [
- "Ġacc",
- "ount"
- ],
- [
- "l",
- "ing"
- ],
- [
- "Ġd",
- "ivers"
- ],
- [
- ".",
- "\""
- ],
- [
- "at",
- "h"
- ],
- [
- "èĭ",
- "±"
- ],
- [
- "ä¼ģ",
- "ä¸ļ"
- ],
- [
- "Ġg",
- "rou"
- ],
- [
- "åľ°",
- "çIJĥ"
- ],
- [
- "å¤",
- "±"
- ],
- [
- "Ġpersonal",
- "ized"
- ],
- [
- "ĠH",
- "e"
- ],
- [
- "表",
- "è¾¾"
- ],
- [
- "cur",
- "ity"
- ],
- [
- "Ġf",
- "ollow"
- ],
- [
- "产",
- "çĶŁ"
- ],
- [
- "Ġe",
- "ar"
- ],
- [
- "åİ",
- "ĭ"
- ],
- [
- "ver",
- "n"
- ],
- [
- "Ġiss",
- "ues"
- ],
- [
- "åĿ",
- "ĩ"
- ],
- [
- "é",
- "²"
- ],
- [
- "Ġd",
- "r"
- ],
- [
- "iv",
- "ing"
- ],
- [
- "Ġtrain",
- "ing"
- ],
- [
- "Ġris",
- "k"
- ],
- [
- "åĩ",
- "½"
- ],
- [
- "åı",
- "²"
- ],
- [
- "æ",
- "ij"
- ],
- [
- "çļĦæĹ",
- "¶"
- ],
- [
- "og",
- "n"
- ],
- [
- "Ġrequ",
- "ire"
- ],
- [
- "Ġenvironment",
- "al"
- ],
- [
- "b",
- "ack"
- ],
- [
- "éĶ",
- "®"
- ],
- [
- "çĸ",
- "Ĺ"
- ],
- [
- "Ġinter",
- "act"
- ],
- [
- "åĽ¢",
- "éĺŁ"
- ],
- [
- "æ¯ı",
- "个"
- ],
- [
- "çĦ¶",
- "åIJİ"
- ],
- [
- "Ġd",
- "ist"
- ],
- [
- "ç͍",
- "äºİ"
- ],
- [
- "认",
- "为"
- ],
- [
- "åĩ½",
- "æķ°"
- ],
- [
- "Ġs",
- "ent"
- ],
- [
- "Ċ",
- "ĠĠĠĠĠĠĠĠ"
- ],
- [
- "Ġredu",
- "cing"
- ],
- [
- "å¹",
- "²"
- ],
- [
- "Ġre",
- "p"
- ],
- [
- "Ġc",
- "aus"
- ],
- [
- "Ġmus",
- "ic"
- ],
- [
- "ç",
- "ª"
- ],
- [
- "Ġmon",
- "itor"
- ],
- [
- "Ġfor",
- "m"
- ],
- [
- "é¢",
- "ľ"
- ],
- [
- "çĹ",
- "ħ"
- ],
- [
- "é¦",
- "Ļ"
- ],
- [
- "Ġof",
- "ten"
- ],
- [
- "åı¯èĥ½",
- "ä¼ļ"
- ],
- [
- "åijĺ",
- "å·¥"
- ],
- [
- "Ġha",
- "nd"
- ],
- [
- "æĬ",
- "ķ"
- ],
- [
- "Ġneed",
- "s"
- ],
- [
- "æŃ¤",
- "å¤ĸ"
- ],
- [
- "åı",
- "ĭ"
- ],
- [
- "iv",
- "ity"
- ],
- [
- "Ġactiv",
- "ities"
- ],
- [
- "åĸľ",
- "欢"
- ],
- [
- "Ġp",
- "ur"
- ],
- [
- "i",
- "an"
- ],
- [
- "s",
- "elf"
- ],
- [
- "åĬ¨",
- "çī©"
- ],
- [
- "com",
- "es"
- ],
- [
- "å",
- "©"
- ],
- [
- "Ġpr",
- "iv"
- ],
- [
- "a",
- "z"
- ],
- [
- "Ġrel",
- "ations"
- ],
- [
- "Ġmach",
- "ine"
- ],
- [
- "çļĦæ",
- "°"
- ],
- [
- "ä»·",
- "æł¼"
- ],
- [
- "ä»·",
- "å̼"
- ],
- [
- "ç´",
- "¢"
- ],
- [
- "Ġfe",
- "ed"
- ],
- [
- "ä¸Ģ",
- "ä¸ĭ"
- ],
- [
- "Ġte",
- "am"
- ],
- [
- "Ġindust",
- "ry"
- ],
- [
- "è´",
- "¢"
- ],
- [
- "ĠP",
- "ro"
- ],
- [
- "Ġw",
- "ant"
- ],
- [
- "ç§",
- "°"
- ],
- [
- "Ġcl",
- "ass"
- ],
- [
- "Ġlo",
- "ve"
- ],
- [
- "åħ³",
- "äºİ"
- ],
- [
- "è¾ĵ",
- "åħ¥"
- ],
- [
- "Ġtrans",
- "port"
- ],
- [
- "Ġcomple",
- "x"
- ],
- [
- "Ġy",
- "ear"
- ],
- [
- "éĶĢ",
- "åĶ®"
- ],
- [
- "å¯",
- "»"
- ],
- [
- "i",
- "ence"
- ],
- [
- "ist",
- "s"
- ],
- [
- "æĶ¯",
- "æĮģ"
- ],
- [
- "Ġm",
- "ind"
- ],
- [
- "Ġf",
- "un"
- ],
- [
- "Ġch",
- "ar"
- ],
- [
- "æĮ",
- "ī"
- ],
- [
- "Ġconcer",
- "ns"
- ],
- [
- "con",
- "om"
- ],
- [
- "ç®Ģ",
- "åįķ"
- ],
- [
- "以ä¸ĭ",
- "æĺ¯"
- ],
- [
- "Ġst",
- "art"
- ],
- [
- "å¹¶",
- "ä¸Ķ"
- ],
- [
- "av",
- "i"
- ],
- [
- "ä¸Ń",
- "åĽ½"
- ],
- [
- "åħĥ",
- "ç´ł"
- ],
- [
- "Ġcon",
- "f"
- ],
- [
- "Ġpos",
- "itive"
- ],
- [
- "Ġc",
- "ur"
- ],
- [
- "Ġc",
- "ount"
- ],
- [
- "er",
- "y"
- ],
- [
- "å",
- "¡"
- ],
- [
- "å®",
- "¤"
- ],
- [
- "Ġco",
- "st"
- ],
- [
- "Ġe",
- "qu"
- ],
- [
- "Ġpol",
- "ic"
- ],
- [
- "ast",
- "e"
- ],
- [
- "a",
- "w"
- ],
- [
- "éħ",
- "Ĵ"
- ],
- [
- "cou",
- "ra"
- ],
- [
- "iv",
- "en"
- ],
- [
- "pl",
- "ace"
- ],
- [
- "ch",
- "ie"
- ],
- [
- "çļĦæķ",
- "°"
- ],
- [
- "åĽł",
- "ç´ł"
- ],
- [
- "Ġf",
- "l"
- ],
- [
- "is",
- "m"
- ],
- [
- "Ġmed",
- "ical"
- ],
- [
- "Ġhum",
- "ans"
- ],
- [
- "Ġaut",
- "om"
- ],
- [
- "ertain",
- "ly"
- ],
- [
- "Ġ",
- "0"
- ],
- [
- "Ġoff",
- "ers"
- ],
- [
- "Ġdet",
- "ect"
- ],
- [
- "Ġ",
- "6"
- ],
- [
- "é£İ",
- "æł¼"
- ],
- [
- "Ġsh",
- "ow"
- ],
- [
- "çģ",
- "«"
- ],
- [
- "Ġan",
- "im"
- ],
- [
- "é¢ľ",
- "èī²"
- ],
- [
- "le",
- "ase"
- ],
- [
- "a",
- "ve"
- ],
- [
- "åĵ",
- "ª"
- ],
- [
- "ĠThe",
- "re"
- ],
- [
- "以",
- "ä¸Ĭ"
- ],
- [
- "æľª",
- "æĿ¥"
- ],
- [
- "X",
- "X"
- ],
- [
- "çī",
- "ĩ"
- ],
- [
- "u",
- "ch"
- ],
- [
- "Ġtas",
- "ks"
- ],
- [
- "åħ·",
- "ä½ĵ"
- ],
- [
- "æ¤į",
- "çī©"
- ],
- [
- "Ġm",
- "in"
- ],
- [
- "èīº",
- "æľ¯"
- ],
- [
- "ic",
- "ult"
- ],
- [
- "Ġexperi",
- "ences"
- ],
- [
- "æİ§",
- "åζ"
- ],
- [
- "b",
- "e"
- ],
- [
- "Ġpat",
- "ients"
- ],
- [
- "å",
- "²"
- ],
- [
- "ĠW",
- "e"
- ],
- [
- "Ġrec",
- "ogn"
- ],
- [
- "çĥ",
- "¤"
- ],
- [
- "Ġsm",
- "all"
- ],
- [
- "åĿ",
- "Ĺ"
- ],
- [
- "å",
- "Ħ"
- ],
- [
- "太",
- "éĺ³"
- ],
- [
- "ct",
- "ion"
- ],
- [
- "Ġ",
- "ent"
- ],
- [
- "æį",
- "¢"
- ],
- [
- "Ġbe",
- "fore"
- ],
- [
- "Ġbe",
- "come"
- ],
- [
- "å·²",
- "ç»ı"
- ],
- [
- "表",
- "çݰ"
- ],
- [
- "Ġexp",
- "lo"
- ],
- [
- "Ġa",
- "chie"
- ],
- [
- "ä»»",
- "åĬ¡"
- ],
- [
- "大",
- "çļĦ"
- ],
- [
- "Ġd",
- "ay"
- ],
- [
- "Ġf",
- "ound"
- ],
- [
- "å±",
- "±"
- ],
- [
- "on",
- "d"
- ],
- [
- "Ġtreat",
- "ment"
- ],
- [
- "pe",
- "nd"
- ],
- [
- "he",
- "n"
- ],
- [
- "Ġcon",
- "dit"
- ],
- [
- "ç¡®",
- "å®ļ"
- ],
- [
- "Ġbusiness",
- "es"
- ],
- [
- "ĠW",
- "h"
- ],
- [
- "æīĢ",
- "æľī"
- ],
- [
- "Ġdevelop",
- "ed"
- ],
- [
- "ç»",
- "Ī"
- ],
- [
- "æŃ¥",
- "骤"
- ],
- [
- "Ġdiff",
- "icult"
- ],
- [
- "åı",
- "·"
- ],
- [
- "ĠR",
- "e"
- ],
- [
- "éĶ",
- "Ļ"
- ],
- [
- "Ġch",
- "o"
- ],
- [
- "Ġqu",
- "est"
- ],
- [
- "Ġtrans",
- "pare"
- ],
- [
- "Ġpro",
- "ject"
- ],
- [
- "Ġcommun",
- "ity"
- ],
- [
- "o",
- "v"
- ],
- [
- "å¸",
- "Ī"
- ],
- [
- "å¼",
- "ł"
- ],
- [
- "åĪĨ",
- "ç±»"
- ],
- [
- "人",
- "çļĦ"
- ],
- [
- "s",
- "is"
- ],
- [
- "çĽ",
- "Ĭ"
- ],
- [
- "o",
- "id"
- ],
- [
- "ĠA",
- "n"
- ],
- [
- "w",
- "ays"
- ],
- [
- "Ġe",
- "as"
- ],
- [
- "Ġaff",
- "ect"
- ],
- [
- "Ġother",
- "s"
- ],
- [
- "Ġreg",
- "ul"
- ],
- [
- "æĢ§",
- "åĴĮ"
- ],
- [
- "åĸ",
- "Ħ"
- ],
- [
- "ag",
- "n"
- ],
- [
- "ä½ľ",
- "为"
- ],
- [
- "åı¯ä»¥",
- "帮åĬ©"
- ],
- [
- "åĦ",
- "¿"
- ],
- [
- "Ġorganiz",
- "ations"
- ],
- [
- "é¸",
- "¡"
- ],
- [
- "åħ",
- "´"
- ],
- [
- "Ġf",
- "riend"
- ],
- [
- "Ġ",
- "$"
- ],
- [
- "Ġdet",
- "ail"
- ],
- [
- "Ġtra",
- "ditional"
- ],
- [
- "Ġdesign",
- "ed"
- ],
- [
- "è´Ń",
- "ä¹°"
- ],
- [
- "ä½ĵ",
- "éªĮ"
- ],
- [
- "ç»",
- "į"
- ],
- [
- "er",
- "m"
- ],
- [
- "Ġcon",
- "nect"
- ],
- [
- "è¿Ļ",
- "æł·"
- ],
- [
- "Ġrecommend",
- "ations"
- ],
- [
- "Ġb",
- "oth"
- ],
- [
- "Ł",
- "éĢļ"
- ],
- [
- "æ¯",
- "į"
- ],
- [
- "Ġs",
- "it"
- ],
- [
- "ä½ľ",
- "ç͍"
- ],
- [
- "ä»ĭ",
- "ç»į"
- ],
- [
- "Ġst",
- "e"
- ],
- [
- "ĠS",
- "ure"
- ],
- [
- "åı",
- "°"
- ],
- [
- "æĤ¨",
- "çļĦ"
- ],
- [
- "Ġs",
- "he"
- ],
- [
- "Ġman",
- "agement"
- ],
- [
- "j",
- "oy"
- ],
- [
- "è´",
- "Ł"
- ],
- [
- "Ġpromot",
- "e"
- ],
- [
- "Ġvari",
- "ous"
- ],
- [
- "(",
- "\""
- ],
- [
- "p",
- "or"
- ],
- [
- "Ġs",
- "ens"
- ],
- [
- "Ġess",
- "ential"
- ],
- [
- "get",
- "her"
- ],
- [
- "ular",
- "ly"
- ],
- [
- "äº",
- "ī"
- ],
- [
- "ir",
- "st"
- ],
- [
- "Ġo",
- "p"
- ],
- [
- "Ġspec",
- "ies"
- ],
- [
- "çݰ",
- "åľ¨"
- ],
- [
- "ch",
- "o"
- ],
- [
- "Ġbeh",
- "avi"
- ],
- [
- "çŃ",
- "ij"
- ],
- [
- "å¥",
- "³"
- ],
- [
- "Ġqu",
- "ality"
- ],
- [
- "Ġex",
- "t"
- ],
- [
- "è",
- "¥"
- ],
- [
- "å®Į",
- "æĪIJ"
- ],
- [
- "æĢ»",
- "ä¹ĭ"
- ],
- [
- "éĥ¨",
- "åĪĨ"
- ],
- [
- "ä»İ",
- "èĢĮ"
- ],
- [
- "åĽ",
- "¾"
- ],
- [
- "Ġty",
- "p"
- ],
- [
- "Ġstr",
- "ate"
- ],
- [
- "è¥",
- "¿"
- ],
- [
- "Ġhe",
- "re"
- ],
- [
- "ar",
- "s"
- ],
- [
- "å¸",
- "Į"
- ],
- [
- "çļĦæ",
- "Ŀ"
- ],
- [
- "å°",
- "Ŀ"
- ],
- [
- "e",
- "e"
- ],
- [
- "i",
- "er"
- ],
- [
- "Ġe",
- "c"
- ],
- [
- "ical",
- "ly"
- ],
- [
- "er",
- "ing"
- ],
- [
- "å¿",
- "µ"
- ],
- [
- "ĠD",
- "e"
- ],
- [
- "Ġne",
- "g"
- ],
- [
- "建",
- "çŃij"
- ],
- [
- "Ġserv",
- "ices"
- ],
- [
- "Ġab",
- "le"
- ],
- [
- "im",
- "es"
- ],
- [
- "Ġopt",
- "ions"
- ],
- [
- "缸",
- "åħ³"
- ],
- [
- "Ġsu",
- "b"
- ],
- [
- "Ġdecis",
- "ion"
- ],
- [
- "ĠC",
- "ertainly"
- ],
- [
- "Ġ",
- "åľ¨"
- ],
- [
- "æ",
- "¢"
- ],
- [
- "Ġserv",
- "ice"
- ],
- [
- ")",
- ":"
- ],
- [
- "带",
- "æĿ¥"
- ],
- [
- "Ġch",
- "ild"
- ],
- [
- "è§£",
- "éĩĬ"
- ],
- [
- "ir",
- "t"
- ],
- [
- "ç",
- "Ĩ"
- ],
- [
- "ä¸į",
- "ä»ħ"
- ],
- [
- "æĿ",
- "¾"
- ],
- [
- "积",
- "æŀģ"
- ],
- [
- "r",
- "on"
- ],
- [
- "åı",
- "¤"
- ],
- [
- "çł",
- "Ķç©¶"
- ],
- [
- "ç²",
- "ī"
- ],
- [
- "h",
- "or"
- ],
- [
- "Ġprof",
- "ess"
- ],
- [
- "çļĦ",
- "éĹ®é¢ĺ"
- ],
- [
- "Ġopportun",
- "ities"
- ],
- [
- "åİĨ",
- "åı²"
- ],
- [
- "Ġde",
- "f"
- ],
- [
- "ĠA",
- "m"
- ],
- [
- "Ġg",
- "r"
- ],
- [
- "a",
- "ur"
- ],
- [
- "å±",
- "Ĥ"
- ],
- [
- "çŃ",
- "ĸ"
- ],
- [
- "Ġpop",
- "ular"
- ],
- [
- "æ´",
- "ģ"
- ],
- [
- "åıij",
- "çݰ"
- ],
- [
- "Ġpo",
- "em"
- ],
- [
- "èµ",
- "Ľ"
- ],
- [
- "Ġo",
- "b"
- ],
- [
- "Ġd",
- "on"
- ],
- [
- "Ġs",
- "ound"
- ],
- [
- "Ġtransport",
- "ation"
- ],
- [
- "i",
- "ous"
- ],
- [
- "åı",
- "¦"
- ],
- [
- "Ġro",
- "le"
- ],
- [
- "Ġf",
- "iel"
- ],
- [
- "ç§ij",
- "åѦ"
- ],
- [
- "èĢ",
- "ģ"
- ],
- [
- "re",
- "en"
- ],
- [
- "æľī",
- "æķĪ"
- ],
- [
- "Ġc",
- "or"
- ],
- [
- "Ġfeed",
- "back"
- ],
- [
- "Ġtechnolo",
- "gies"
- ],
- [
- "交",
- "éĢļ"
- ],
- [
- "Ġad",
- "apt"
- ],
- [
- "'",
- "re"
- ],
- [
- "erv",
- "ation"
- ],
- [
- "Ġcommun",
- "ities"
- ],
- [
- "çݰ",
- "代"
- ],
- [
- "Ġlo",
- "ok"
- ],
- [
- "Ġf",
- "ac"
- ],
- [
- "ç͵",
- "å½±"
- ],
- [
- "Ġcol",
- "lect"
- ],
- [
- "å¾Ĺ",
- "åΰ"
- ],
- [
- "h",
- "ips"
- ],
- [
- "Ġav",
- "ail"
- ],
- [
- "ere",
- "n"
- ],
- [
- "ä¸Ģ",
- "èµ·"
- ],
- [
- "çī",
- "Ľ"
- ],
- [
- "Ġpos",
- "s"
- ],
- [
- "Ġwe",
- "ather"
- ],
- [
- "Ġeffort",
- "s"
- ],
- [
- "¿",
- "Ģ"
- ],
- [
- "æĹ",
- "ħ"
- ],
- [
- "o",
- "h"
- ],
- [
- "Ġcol",
- "labor"
- ],
- [
- "æĭ",
- "¥"
- ],
- [
- "æĪIJ",
- "åĬŁ"
- ],
- [
- "èİ·",
- "å¾Ĺ"
- ],
- [
- "å±",
- "ħ"
- ],
- [
- "Ġt",
- "re"
- ],
- [
- "Ġs",
- "ources"
- ],
- [
- "Ġstud",
- "y"
- ],
- [
- "Ġprogra",
- "ms"
- ],
- [
- "éĻ",
- "IJ"
- ],
- [
- "Ġt",
- "ips"
- ],
- [
- "Ġmark",
- "et"
- ],
- [
- "al",
- "ly"
- ],
- [
- "å®",
- "³"
- ],
- [
- "w",
- "ards"
- ],
- [
- "æ£",
- "Ģ"
- ],
- [
- "ä¸Ģ",
- "ç¯ĩ"
- ],
- [
- "ri",
- "or"
- ],
- [
- "Ġto",
- "p"
- ],
- [
- "Ġe",
- "nd"
- ],
- [
- "å",
- "ĭ"
- ],
- [
- "Ġlar",
- "ge"
- ],
- [
- "ici",
- "ency"
- ],
- [
- "Ġde",
- "c"
- ],
- [
- "å®ļ",
- "çļĦ"
- ],
- [
- "ic",
- "ient"
- ],
- [
- "è¿ĩç¨ĭ",
- "ä¸Ń"
- ],
- [
- "lic",
- "ations"
- ],
- [
- "ç¼",
- "º"
- ],
- [
- "Ġto",
- "ur"
- ],
- [
- "Ġto",
- "gether"
- ],
- [
- "人",
- "å·¥"
- ],
- [
- "Ġtool",
- "s"
- ],
- [
- "æĸ",
- "¯"
- ],
- [
- "æ°",
- "ij"
- ],
- [
- "æĬ",
- "Ĭ"
- ],
- [
- "ä¹ĭéĹ´",
- "çļĦ"
- ],
- [
- "çī¹",
- "çĤ¹"
- ],
- [
- "Ġbe",
- "l"
- ],
- [
- "ditional",
- "ly"
- ],
- [
- "åĪ©",
- "ç͍"
- ],
- [
- "è¾",
- "¹"
- ],
- [
- "éĻ",
- "į"
- ],
- [
- "ĠI",
- "f"
- ],
- [
- "é¢",
- "Ŀ"
- ],
- [
- "åį",
- "ı"
- ],
- [
- "å¾",
- "Ģ"
- ],
- [
- "l",
- "ish"
- ],
- [
- "è¯",
- "ī"
- ],
- [
- "in",
- "s"
- ],
- [
- "å¥",
- "¶"
- ],
- [
- "Ġe",
- "conom"
- ],
- [
- "Ġinv",
- "est"
- ],
- [
- "ĠD",
- "o"
- ],
- [
- "t",
- "ain"
- ],
- [
- "åĩº",
- "çݰ"
- ],
- [
- "çļĦ",
- "å½±åĵį"
- ],
- [
- "ater",
- "ial"
- ],
- [
- "Ġs",
- "ure"
- ],
- [
- "Ġp",
- "ass"
- ],
- [
- "çĶ",
- "»"
- ],
- [
- "è´",
- "£"
- ],
- [
- "ç»ĵ",
- "æŀĦ"
- ],
- [
- "æķ",
- "ħ"
- ],
- [
- "æĥħ",
- "æĦŁ"
- ],
- [
- "æ",
- "¿Ģ"
- ],
- [
- "ell",
- "ig"
- ],
- [
- "ä¼",
- "Ĺ"
- ],
- [
- "æ¯Ķ",
- "è¾ĥ"
- ],
- [
- "ter",
- "n"
- ],
- [
- "Ġout",
- "comes"
- ],
- [
- "u",
- "p"
- ],
- [
- "Ġbe",
- "aut"
- ],
- [
- "re",
- "ad"
- ],
- [
- "çĶŁ",
- "æĪIJ"
- ],
- [
- "æķ°",
- "åŃĹ"
- ],
- [
- "Ġde",
- "m"
- ],
- [
- "i",
- "res"
- ],
- [
- "åı¯ä»¥",
- "éĢļè¿ĩ"
- ],
- [
- "æĸ°",
- "çļĦ"
- ],
- [
- "Ġde",
- "ep"
- ],
- [
- "å",
- "¨"
- ],
- [
- "çĭ",
- "Ĺ"
- ],
- [
- "åħ³",
- "注"
- ],
- [
- "çĶŁ",
- "åij½"
- ],
- [
- "ä¼ł",
- "绣"
- ],
- [
- "Ġst",
- "ay"
- ],
- [
- "æŃ",
- "Į"
- ],
- [
- "åħ³",
- "éĶ®"
- ],
- [
- "Ġpl",
- "ace"
- ],
- [
- "主",
- "é¢ĺ"
- ],
- [
- "å¾Ī",
- "å¤ļ"
- ],
- [
- "èĪ",
- "Ĵ"
- ],
- [
- "Ġprofess",
- "ional"
- ],
- [
- "y",
- "le"
- ],
- [
- "æĽ",
- "²"
- ],
- [
- "1",
- "9"
- ],
- [
- "Ġess",
- "ay"
- ],
- [
- "Ġg",
- "ive"
- ],
- [
- "ç³",
- "ĸ"
- ],
- [
- "Ġon",
- "ly"
- ],
- [
- "æŁ",
- "IJ"
- ],
- [
- "Ġph",
- "ys"
- ],
- [
- "对",
- "è¯Ŀ"
- ],
- [
- "Ġcont",
- "ro"
- ],
- [
- "Ġam",
- "ount"
- ],
- [
- "ce",
- "pt"
- ],
- [
- "iz",
- "ation"
- ],
- [
- "ç¼ĸ",
- "åĨĻ"
- ],
- [
- "åıĹ",
- "åΰ"
- ],
- [
- "Ġal",
- "ways"
- ],
- [
- "æ¯Ķ",
- "å¦Ĥ"
- ],
- [
- "Ġpriv",
- "acy"
- ],
- [
- "a",
- "u"
- ],
- [
- "____",
- "____"
- ],
- [
- "Ġrespons",
- "ible"
- ],
- [
- "(",
- ")"
- ],
- [
- "çŃī",
- "çŃī"
- ],
- [
- "Ġm",
- "aterial"
- ],
- [
- "Ġon",
- "line"
- ],
- [
- "é",
- "¼"
- ],
- [
- "æĶ",
- "¿"
- ],
- [
- "åĽ",
- "Ľ"
- ],
- [
- "Ġen",
- "joy"
- ],
- [
- "åľ",
- "Ł"
- ],
- [
- "Ġsaf",
- "ety"
- ],
- [
- "Ġt",
- "w"
- ],
- [
- "Ġcommunic",
- "ation"
- ],
- [
- "ä¸",
- "½"
- ],
- [
- "æĺ",
- "¾"
- ],
- [
- "olut",
- "ion"
- ],
- [
- "er",
- "g"
- ],
- [
- "į",
- "ä½ľ"
- ],
- [
- "Ġus",
- "er"
- ],
- [
- "Ġemot",
- "ional"
- ],
- [
- "t",
- "ime"
- ],
- [
- "é",
- "¾"
- ],
- [
- "Ġse",
- "curity"
- ],
- [
- "Ġs",
- "ense"
- ],
- [
- "el",
- "ines"
- ],
- [
- "åĬ",
- "±"
- ],
- [
- "çī©",
- "è´¨"
- ],
- [
- "u",
- "ra"
- ],
- [
- "Ġsh",
- "are"
- ],
- [
- "Ġanalyz",
- "ing"
- ],
- [
- "it",
- "al"
- ],
- [
- "é",
- "±"
- ],
- [
- "irt",
- "ual"
- ],
- [
- "Ġvis",
- "it"
- ],
- [
- "b",
- "ers"
- ],
- [
- "Ġc",
- "our"
- ],
- [
- "Ġpro",
- "ble"
- ],
- [
- "设",
- "å¤ĩ"
- ],
- [
- "at",
- "ch"
- ],
- [
- "l",
- "and"
- ],
- [
- "é±",
- "¼"
- ],
- [
- "æĪij们",
- "éľĢè¦ģ"
- ],
- [
- "ç¨",
- "³"
- ],
- [
- "ib",
- "ility"
- ],
- [
- "Ġeff",
- "iciency"
- ],
- [
- "å£",
- "°"
- ],
- [
- "è",
- "Ĵ"
- ],
- [
- "æľº",
- "åύ"
- ],
- [
- "Ġcle",
- "ar"
- ],
- [
- "åζ",
- "å®ļ"
- ],
- [
- "iz",
- "ing"
- ],
- [
- "Ġcondit",
- "ions"
- ],
- [
- "l",
- "usion"
- ],
- [
- "Ġlo",
- "w"
- ],
- [
- "Ġl",
- "im"
- ],
- [
- "her",
- "s"
- ],
- [
- "Ġris",
- "ks"
- ],
- [
- "ç¿",
- "»"
- ],
- [
- "Ġle",
- "t"
- ],
- [
- "åĴ",
- "ĸ"
- ],
- [
- "å¿ĥ",
- "çIJĨ"
- ],
- [
- "è¿",
- "ľ"
- ],
- [
- "pr",
- "int"
- ],
- [
- "Ġchang",
- "es"
- ],
- [
- "Ġme",
- "as"
- ],
- [
- "Ġimpro",
- "ving"
- ],
- [
- "Ġc",
- "rit"
- ],
- [
- "5",
- "0"
- ],
- [
- "å¸Į",
- "æľĽ"
- ],
- [
- "Ġa",
- "ud"
- ],
- [
- "åį",
- "Ĺ"
- ],
- [
- "æĹł",
- "æ³ķ"
- ],
- [
- "Ġneg",
- "ative"
- ],
- [
- "项",
- "缮"
- ],
- [
- "u",
- "nd"
- ],
- [
- "at",
- "s"
- ],
- [
- "Ġcompan",
- "ies"
- ],
- [
- "æī¾",
- "åΰ"
- ],
- [
- "Ġcont",
- "ribut"
- ],
- [
- "æŃ£",
- "ç¡®"
- ],
- [
- "é»",
- "Ħ"
- ],
- [
- "å±",
- "ŀ"
- ],
- [
- "Ġunderstand",
- "ing"
- ],
- [
- "Ġm",
- "ult"
- ],
- [
- "Ġc",
- "lo"
- ],
- [
- "å¾",
- "ģ"
- ],
- [
- "Ġp",
- "rior"
- ],
- [
- "r",
- "im"
- ],
- [
- "人工",
- "æĻºèĥ½"
- ],
- [
- "Ġvari",
- "ety"
- ],
- [
- "Ġt",
- "aking"
- ],
- [
- "å",
- "Ĥ"
- ],
- [
- "as",
- "ter"
- ],
- [
- "od",
- "y"
- ],
- [
- "Ġ",
- "{"
- ],
- [
- "çļĦ",
- "éĩįè¦ģ"
- ],
- [
- "Ġf",
- "ore"
- ],
- [
- "èµĦ",
- "æºIJ"
- ],
- [
- "è¦ģ",
- "æ±Ĥ"
- ],
- [
- "Ġfeat",
- "ures"
- ],
- [
- "èį",
- "ī"
- ],
- [
- "m",
- "e"
- ],
- [
- "èĮ",
- "ĥ"
- ],
- [
- "Ġo",
- "per"
- ],
- [
- "çº",
- "§"
- ],
- [
- "é²",
- "ľ"
- ],
- [
- "æĬĢ",
- "å·§"
- ],
- [
- "ij",
- "æĪĺ"
- ],
- [
- "ç±»",
- "åŀĭ"
- ],
- [
- "æĿ",
- "¿"
- ],
- [
- "è½",
- "¯"
- ],
- [
- "e",
- "w"
- ],
- [
- "Ġrest",
- "aur"
- ],
- [
- "Ġwith",
- "out"
- ],
- [
- "ruct",
- "ure"
- ],
- [
- "çļĦ",
- "æĺ¯"
- ],
- [
- "ç",
- "ı"
- ],
- [
- "Ġl",
- "ist"
- ],
- [
- "ur",
- "ate"
- ],
- [
- "Ġbo",
- "ok"
- ],
- [
- "äº",
- "²"
- ],
- [
- "åº",
- "Ĺ"
- ],
- [
- "ä¹Ł",
- "æĺ¯"
- ],
- [
- "ä»»",
- "ä½ķ"
- ],
- [
- "Ġc",
- "am"
- ],
- [
- "ĠB",
- "e"
- ],
- [
- "Ġgo",
- "vern"
- ],
- [
- "Ġbehavi",
- "or"
- ],
- [
- "è®Ń",
- "ç»ĥ"
- ],
- [
- "Ġfam",
- "ily"
- ],
- [
- "æĿ",
- "Ĥ"
- ],
- [
- "Ġc",
- "ity"
- ],
- [
- "Ġappro",
- "ach"
- ],
- [
- "Ġacc",
- "urate"
- ],
- [
- "Ġs",
- "om"
- ],
- [
- "Ġe",
- "l"
- ],
- [
- "èĪ",
- "ŀ"
- ],
- [
- "è",
- "ŀ"
- ],
- [
- "åŁº",
- "æľ¬"
- ],
- [
- "Ġdis",
- "e"
- ],
- [
- "Ġen",
- "coura"
- ],
- [
- "ĠW",
- "hat"
- ],
- [
- "å",
- "ĥ"
- ],
- [
- "è¯",
- "¦"
- ],
- [
- "¦",
- "Ĥ"
- ],
- [
- "å·¥",
- "åħ·"
- ],
- [
- "åķ",
- "¡"
- ],
- [
- "Ġst",
- "ill"
- ],
- [
- "cho",
- "ol"
- ],
- [
- "æĦŁ",
- "åΰ"
- ],
- [
- "çĶŁ",
- "çī©"
- ],
- [
- "åĴĸ",
- "åķ¡"
- ],
- [
- "åĩĨ",
- "å¤ĩ"
- ],
- [
- "Ġw",
- "aste"
- ],
- [
- "Ġev",
- "ents"
- ],
- [
- "æķĻ",
- "èĤ²"
- ],
- [
- "Ġ",
- "8"
- ],
- [
- "Ġm",
- "ust"
- ],
- [
- "i",
- "ed"
- ],
- [
- "as",
- "ing"
- ],
- [
- "å½¢",
- "æĪIJ"
- ],
- [
- "Ġproduct",
- "s"
- ],
- [
- "åħ",
- "¸"
- ],
- [
- "è®",
- "²"
- ],
- [
- "f",
- "ter"
- ],
- [
- "å·",
- "®"
- ],
- [
- "l",
- "ess"
- ],
- [
- "Ġc",
- "ro"
- ],
- [
- "Ġfin",
- "an"
- ],
- [
- "åıį",
- "åºĶ"
- ],
- [
- "åĪĽ",
- "éĢł"
- ],
- [
- "Ġguid",
- "elines"
- ],
- [
- "åĪ",
- "¤"
- ],
- [
- "ä½ľ",
- "åĵģ"
- ],
- [
- "表",
- "示"
- ],
- [
- "å¼",
- "Ĥ"
- ],
- [
- "Ġknow",
- "n"
- ],
- [
- "Ġt",
- "est"
- ],
- [
- "è¯",
- "¯"
- ],
- [
- "o",
- "pe"
- ],
- [
- "Ġus",
- "ers"
- ],
- [
- "A",
- "I"
- ],
- [
- "å¾",
- "·"
- ],
- [
- "ne",
- "w"
- ],
- [
- "è¿",
- "½"
- ],
- [
- "iqu",
- "es"
- ],
- [
- "模",
- "åŀĭ"
- ],
- [
- "åĬĽ",
- "åĴĮ"
- ],
- [
- "Ġhist",
- "ory"
- ],
- [
- "ĠA",
- "l"
- ],
- [
- "æĬķ",
- "èµĦ"
- ],
- [
- "å°Ŀ",
- "è¯ķ"
- ],
- [
- "an",
- "k"
- ],
- [
- "Ġh",
- "ome"
- ],
- [
- "éĴ",
- "Ł"
- ],
- [
- "ä¸",
- "°"
- ],
- [
- "èĪĴ",
- "éĢĤ"
- ],
- [
- "Ġincre",
- "ase"
- ],
- [
- "Ġh",
- "ab"
- ],
- [
- "åĪ",
- "»"
- ],
- [
- "è¾ĵ",
- "åĩº"
- ],
- [
- "Ġlead",
- "ing"
- ],
- [
- "Ġ",
- "7"
- ],
- [
- "é£İ",
- "éĻ©"
- ],
- [
- "Ġperform",
- "ance"
- ],
- [
- "Ġha",
- "pp"
- ],
- [
- "åŃ",
- "£"
- ],
- [
- "Ġst",
- "and"
- ],
- [
- "t",
- "y"
- ],
- [
- "ç¦",
- "ı"
- ],
- [
- "Ġcustom",
- "ers"
- ],
- [
- "åį",
- "İ"
- ],
- [
- "Ġbel",
- "ie"
- ],
- [
- "Ġcompan",
- "y"
- ],
- [
- "å½",
- "ķ"
- ],
- [
- "é£Ł",
- "çī©"
- ],
- [
- "ĠU",
- "n"
- ],
- [
- "Ġsu",
- "mm"
- ],
- [
- "re",
- "nt"
- ],
- [
- "ĠC",
- "on"
- ],
- [
- "éĢĤ",
- "éĩı"
- ],
- [
- "an",
- "ced"
- ],
- [
- "Ġ",
- "i"
- ],
- [
- "Ġl",
- "ight"
- ],
- [
- "Ġanaly",
- "sis"
- ],
- [
- "å°",
- "Ĭ"
- ],
- [
- "ĠU",
- "se"
- ],
- [
- "ou",
- "se"
- ],
- [
- "t",
- "ed"
- ],
- [
- "Ġchar",
- "act"
- ],
- [
- "Ġ",
- "#"
- ],
- [
- "t",
- "o"
- ],
- [
- "ç»",
- "ľ"
- ],
- [
- "ä¸į",
- "æĺ¯"
- ],
- [
- "Ġdevelop",
- "ing"
- ],
- [
- "åŁ",
- "¹"
- ],
- [
- "Ġstrate",
- "gies"
- ],
- [
- "Ġm",
- "ight"
- ],
- [
- "çŁ",
- "Ń"
- ],
- [
- "çļĦæ",
- "İ"
- ],
- [
- "Ġf",
- "irst"
- ],
- [
- "èĥ",
- "Į"
- ],
- [
- "çĮ",
- "«"
- ],
- [
- "Ġinclud",
- "es"
- ],
- [
- "åĽ",
- "Ń"
- ],
- [
- "Ġdi",
- "agn"
- ],
- [
- "Ġgrow",
- "th"
- ],
- [
- "ä¸ĵ",
- "ä¸ļ"
- ],
- [
- "Ġdo",
- "es"
- ],
- [
- "1",
- "2"
- ],
- [
- "ç»",
- "¿"
- ],
- [
- "Ġke",
- "ep"
- ],
- [
- "详",
- "ç»Ĩ"
- ],
- [
- "åĥ",
- "ı"
- ],
- [
- "åıij",
- "çĶŁ"
- ],
- [
- "f",
- "act"
- ],
- [
- "åı¯ä»¥",
- "åľ¨"
- ],
- [
- "ç«",
- "Ļ"
- ],
- [
- "æĭ",
- "ī"
- ],
- [
- "æµ",
- "İ"
- ],
- [
- "Ġchat",
- "bots"
- ],
- [
- "Ġbre",
- "ak"
- ],
- [
- "è¡",
- "¡"
- ],
- [
- "çŁ",
- "³"
- ],
- [
- "æĮģ",
- "ç»Ń"
- ],
- [
- "l",
- "ife"
- ],
- [
- "Ġ1",
- "0"
- ],
- [
- "æ´",
- "Ĺ"
- ],
- [
- "ĠAd",
- "ditionally"
- ],
- [
- "å£",
- "«"
- ],
- [
- "em",
- "ber"
- ],
- [
- "Ġgo",
- "als"
- ],
- [
- "å¾",
- "®"
- ],
- [
- "Ġv",
- "iew"
- ],
- [
- "Â",
- "·"
- ],
- [
- "o",
- "ve"
- ],
- [
- "åŁº",
- "ç¡"
- ],
- [
- "Ġoptim",
- "ize"
- ],
- [
- "Ġt",
- "em"
- ],
- [
- "Ġd",
- "own"
- ],
- [
- "åŁºç¡",
- "Ģ"
- ],
- [
- "è¶",
- "ħ"
- ],
- [
- "er",
- "cis"
- ],
- [
- "Ġl",
- "ess"
- ],
- [
- "e",
- "es"
- ],
- [
- "æĿ",
- "ĥ"
- ],
- [
- "Ġke",
- "y"
- ],
- [
- "Ġwor",
- "ks"
- ],
- [
- "è®",
- "¨"
- ],
- [
- "åı¥",
- "åŃIJ"
- ],
- [
- "Ġro",
- "bot"
- ],
- [
- "us",
- "s"
- ],
- [
- "åħ¨",
- "çIJĥ"
- ],
- [
- "ç»ı",
- "æµİ"
- ],
- [
- "æīį",
- "èĥ½"
- ],
- [
- "eg",
- "r"
- ],
- [
- "ä»ĸ们",
- "çļĦ"
- ],
- [
- "äº",
- "Ķ"
- ],
- [
- "èµ·",
- "æĿ¥"
- ],
- [
- "ç",
- "ĵ"
- ],
- [
- "Ġfact",
- "ors"
- ],
- [
- "Ġcult",
- "ural"
- ],
- [
- "æľ",
- "¨"
- ],
- [
- "Ġwork",
- "ing"
- ],
- [
- "ä¼",
- "¼"
- ],
- [
- "èIJ",
- "½"
- ],
- [
- "éĢŁ",
- "度"
- ],
- [
- "ä½",
- "ı"
- ],
- [
- "Ġeffect",
- "s"
- ],
- [
- "å©",
- "ļ"
- ],
- [
- "b",
- "r"
- ],
- [
- "åİ",
- "ħ"
- ],
- [
- "ra",
- "in"
- ],
- [
- "\"",
- ")"
- ],
- [
- "åѦ",
- "çĶŁ"
- ],
- [
- "\"",
- ","
- ],
- [
- "Ġp",
- "ar"
- ],
- [
- "at",
- "form"
- ],
- [
- "Ġens",
- "uring"
- ],
- [
- "çͱ",
- "äºİ"
- ],
- [
- "Ġm",
- "uch"
- ],
- [
- "Ġwor",
- "ds"
- ],
- [
- "Ġm",
- "ar"
- ],
- [
- "ç»ı",
- "éªĮ"
- ],
- [
- "为",
- "äºĨ"
- ],
- [
- "åIJĪ",
- "ä½ľ"
- ],
- [
- "v",
- "en"
- ],
- [
- "Ġ",
- "/"
- ],
- [
- "Ġfinan",
- "cial"
- ],
- [
- "wor",
- "k"
- ],
- [
- "or",
- "ies"
- ],
- [
- "æ²",
- "»"
- ],
- [
- "Ġtechn",
- "iques"
- ],
- [
- "æĭ¥",
- "æľī"
- ],
- [
- "ra",
- "p"
- ],
- [
- "å°",
- "Ķ"
- ],
- [
- "Ġ",
- "est"
- ],
- [
- "Ġavail",
- "able"
- ],
- [
- "Ġl",
- "it"
- ],
- [
- "æ",
- "¹"
- ],
- [
- "Ġeff",
- "icient"
- ],
- [
- "el",
- "s"
- ],
- [
- "o",
- "ver"
- ],
- [
- "Ġl",
- "and"
- ],
- [
- "Ġare",
- "a"
- ],
- [
- "Ġint",
- "ellig"
- ],
- [
- "Ġpre",
- "f"
- ],
- [
- "at",
- "ure"
- ],
- [
- "çŁ¥",
- "è¯Ĩ"
- ],
- [
- "æĵ",
- "įä½ľ"
- ],
- [
- "å¾",
- "ħ"
- ],
- [
- "ig",
- "ate"
- ],
- [
- "çļĦæ",
- "Ķ"
- ],
- [
- "Ġme",
- "an"
- ],
- [
- "b",
- "o"
- ],
- [
- "Ġcontro",
- "l"
- ],
- [
- "éĩĩ",
- "ç͍"
- ],
- [
- "ric",
- "ult"
- ],
- [
- "Ġprogra",
- "mm"
- ],
- [
- "Ġto",
- "wards"
- ],
- [
- "th",
- "ing"
- ],
- [
- "ä¸į",
- "è¦ģ"
- ],
- [
- "Ġth",
- "ough"
- ],
- [
- "å½",
- "©"
- ],
- [
- "Ġc",
- "ertain"
- ],
- [
- "Ġw",
- "ild"
- ],
- [
- "ä»",
- "Ĭ"
- ],
- [
- "Ġcons",
- "ervation"
- ],
- [
- "çŁ¥",
- "éģĵ"
- ],
- [
- "Ġreal",
- "ly"
- ],
- [
- "çļĦ",
- "åľ°"
- ],
- [
- "i",
- "o"
- ],
- [
- "é¥",
- "°"
- ],
- [
- "Ġf",
- "ul"
- ],
- [
- "çݯ",
- "ä¿Ŀ"
- ],
- [
- "Ġexplo",
- "re"
- ],
- [
- "çļĦæ",
- "¸"
- ],
- [
- "Ġdivers",
- "e"
- ],
- [
- "åĬł",
- "强"
- ],
- [
- "çļ",
- "®"
- ],
- [
- "Ġemot",
- "ions"
- ],
- [
- "Ġav",
- "oid"
- ],
- [
- "'",
- "ll"
- ],
- [
- "çļĦæ",
- "ī"
- ],
- [
- "åį",
- "¡"
- ],
- [
- "Ġpl",
- "atform"
- ],
- [
- "an",
- "ces"
- ],
- [
- "Ġsit",
- "u"
- ],
- [
- "ä»",
- "ĺ"
- ],
- [
- "ä½į",
- "ç½®"
- ],
- [
- "or",
- "ing"
- ],
- [
- "çĽ",
- "IJ"
- ],
- [
- "ä¸",
- "ĩ"
- ],
- [
- "Ġde",
- "v"
- ],
- [
- "n",
- "ov"
- ],
- [
- "as",
- "h"
- ],
- [
- "Ġtw",
- "o"
- ],
- [
- "å®",
- "ł"
- ],
- [
- "b",
- "on"
- ],
- [
- "èµ",
- "°"
- ],
- [
- "åĪĹ",
- "表"
- ],
- [
- "Ġc",
- "y"
- ],
- [
- "èį",
- "IJ"
- ],
- [
- "ĠS",
- "ome"
- ],
- [
- "Ġexpl",
- "ain"
- ],
- [
- "Ġa",
- "ware"
- ],
- [
- "社",
- "交"
- ],
- [
- "d",
- "ay"
- ],
- [
- "åı",
- "Į"
- ],
- [
- "æ²",
- "ŁéĢļ"
- ],
- [
- "æ°",
- "§"
- ],
- [
- "å¼Ģ",
- "åıij"
- ],
- [
- "åħ¬åı¸",
- "çļĦ"
- ],
- [
- "Ġa",
- "ir"
- ],
- [
- "åĩ",
- "»"
- ],
- [
- "ar",
- "ing"
- ],
- [
- "éĥ½",
- "æĺ¯"
- ],
- [
- "Ġlevel",
- "s"
- ],
- [
- "od",
- "s"
- ],
- [
- "Ġste",
- "ps"
- ],
- [
- "Ġc",
- "ap"
- ],
- [
- "æ´",
- "ŀ"
- ],
- [
- "é©",
- "¬"
- ],
- [
- "Ġret",
- "urn"
- ],
- [
- "Ġm",
- "et"
- ],
- [
- "çĶŁ",
- "æĢģ"
- ],
- [
- "丰",
- "å¯Į"
- ],
- [
- "æŁ",
- "ĵ"
- ],
- [
- "æīĢ",
- "以"
- ],
- [
- "é¡",
- "»"
- ],
- [
- "Ġ",
- "er"
- ],
- [
- "Ġf",
- "ra"
- ],
- [
- "3",
- "0"
- ],
- [
- "è",
- "ĵ"
- ],
- [
- "âĢ",
- "Ķ"
- ],
- [
- "Ġ",
- "å½ĵ"
- ],
- [
- "a",
- "h"
- ],
- [
- "ä¿",
- "ĥ"
- ],
- [
- "Ġlike",
- "ly"
- ],
- [
- "ĠĠĠĠĠĠĠĠ",
- "ĠĠĠĠĠĠĠĠ"
- ],
- [
- "åĪ",
- "Ŀ"
- ],
- [
- "Ġcreat",
- "ing"
- ],
- [
- "Ġf",
- "arm"
- ],
- [
- "Ġb",
- "al"
- ],
- [
- "Ġl",
- "ives"
- ],
- [
- "å®ĥ",
- "çļĦ"
- ],
- [
- "Ġab",
- "ility"
- ],
- [
- "ä¸Ĭ",
- "çļĦ"
- ],
- [
- "Ġsent",
- "ence"
- ],
- [
- "åĤ",
- "¨"
- ],
- [
- "Ġr",
- "out"
- ],
- [
- "Ġprovid",
- "es"
- ],
- [
- "Ġag",
- "ain"
- ],
- [
- "å®ł",
- "çī©"
- ],
- [
- "éĢ",
- "IJ"
- ],
- [
- "Ġyear",
- "s"
- ],
- [
- "èŀ",
- "į"
- ],
- [
- "Ġphys",
- "ical"
- ],
- [
- "P",
- "ython"
- ],
- [
- "ĠE",
- "x"
- ],
- [
- "it",
- "ing"
- ],
- [
- "è°ĥ",
- "æķ´"
- ],
- [
- "ç½ij",
- "绾"
- ],
- [
- "æħ",
- "¢"
- ],
- [
- "空",
- "éĹ´"
- ],
- [
- "åĽ",
- "°"
- ],
- [
- "è±",
- "Ĩ"
- ],
- [
- "æĽ´å¤ļ",
- "çļĦ"
- ],
- [
- "ĠA",
- "r"
- ],
- [
- "Ġmain",
- "tain"
- ],
- [
- "å®ŀ",
- "éĻħ"
- ],
- [
- "Ġtra",
- "vel"
- ],
- [
- "Ġs",
- "at"
- ],
- [
- "p",
- "ro"
- ],
- [
- "ç͵",
- "åŃIJ"
- ],
- [
- "æ±",
- "½"
- ],
- [
- "e",
- "x"
- ],
- [
- "åģ",
- "ĩ"
- ],
- [
- "æIJ",
- "Ń"
- ],
- [
- "éļı",
- "çĿĢ"
- ],
- [
- "è¿ĺ",
- "æľī"
- ],
- [
- "ç¤",
- "¼"
- ],
- [
- "al",
- "e"
- ],
- [
- "Ġcons",
- "um"
- ],
- [
- "Ċ",
- "Ġ"
- ],
- [
- "n",
- "cy"
- ],
- [
- "Ġquest",
- "ions"
- ],
- [
- "f",
- "ort"
- ],
- [
- "m",
- "aking"
- ],
- [
- "Ġdes",
- "c"
- ],
- [
- "1",
- "5"
- ],
- [
- "Ġinvol",
- "ves"
- ],
- [
- "Ġst",
- "ress"
- ],
- [
- "åŃĹ",
- "符"
- ],
- [
- "he",
- "re"
- ],
- [
- "Ġimpact",
- "s"
- ],
- [
- "Ġex",
- "ercis"
- ],
- [
- "åĿ",
- "ļ"
- ],
- [
- "led",
- "ge"
- ],
- [
- "ç§ij",
- "æĬĢ"
- ],
- [
- "oc",
- "i"
- ],
- [
- "Ġeffective",
- "ly"
- ],
- [
- "æ¶Ī",
- "è´¹"
- ],
- [
- "Ġconc",
- "lusion"
- ],
- [
- "éĺ",
- "ħ"
- ],
- [
- "Ġst",
- "re"
- ],
- [
- "iss",
- "ions"
- ],
- [
- "æ·",
- "»"
- ],
- [
- "I",
- "t"
- ],
- [
- "éĿ",
- "Ļ"
- ],
- [
- "Ġv",
- "irtual"
- ],
- [
- "è¡",
- "£"
- ],
- [
- "Ġachie",
- "ve"
- ],
- [
- "our",
- "ce"
- ],
- [
- "è¿",
- "ŀ"
- ],
- [
- "ac",
- "ks"
- ],
- [
- "表",
- "æł¼"
- ],
- [
- "Ġimport",
- "ance"
- ],
- [
- "èĩª",
- "æĪij"
- ],
- [
- "The",
- "se"
- ],
- [
- "n",
- "um"
- ],
- [
- "çļĦæ",
- "ł"
- ],
- [
- "Ġrelations",
- "hips"
- ],
- [
- "Ġwork",
- "ers"
- ],
- [
- "g",
- "ical"
- ],
- [
- "or",
- "por"
- ],
- [
- "ers",
- "on"
- ],
- [
- "åij",
- "¢"
- ],
- [
- "nd",
- "s"
- ],
- [
- "æİ¨",
- "èįIJ"
- ],
- [
- "oh",
- "n"
- ],
- [
- "å¿ħ",
- "é¡»"
- ],
- [
- "容",
- "æĺĵ"
- ],
- [
- "ĠG",
- "o"
- ],
- [
- "Ġt",
- "ell"
- ],
- [
- "ĠR",
- "es"
- ],
- [
- "on",
- "om"
- ],
- [
- "Ġbe",
- "c"
- ],
- [
- "æ³",
- "Ľ"
- ],
- [
- "p",
- "os"
- ],
- [
- "Ġmo",
- "ve"
- ],
- [
- "Ġst",
- "ory"
- ],
- [
- "æŃ",
- "¢"
- ],
- [
- "Ġprior",
- "it"
- ],
- [
- "Ġindust",
- "ries"
- ],
- [
- "è",
- "ľ"
- ],
- [
- "Ġposs",
- "ible"
- ],
- [
- "ĠM",
- "an"
- ],
- [
- "Ġexp",
- "ress"
- ],
- [
- "ab",
- "ilities"
- ],
- [
- "Ġint",
- "egr"
- ],
- [
- "代",
- "表"
- ],
- [
- "Ġrespon",
- "d"
- ],
- [
- "åĪĨ",
- "éĴŁ"
- ],
- [
- "æľº",
- "ä¼ļ"
- ],
- [
- "Ġth",
- "ings"
- ],
- [
- "交",
- "æµģ"
- ],
- [
- "Ġm",
- "eth"
- ],
- [
- "ur",
- "ther"
- ],
- [
- "Ġw",
- "ide"
- ],
- [
- "èij",
- "Ĺ"
- ],
- [
- "æĪij",
- "çļĦ"
- ],
- [
- "ĸçķ",
- "¥"
- ],
- [
- "id",
- "es"
- ],
- [
- "eth",
- "ing"
- ],
- [
- "ĠWh",
- "ile"
- ],
- [
- "p",
- "an"
- ],
- [
- "çŃ",
- "ĸçķ¥"
- ],
- [
- "Ġc",
- "ent"
- ],
- [
- "Ġp",
- "lease"
- ],
- [
- "olo",
- "gy"
- ],
- [
- "ura",
- "cy"
- ],
- [
- "å¾",
- "ª"
- ],
- [
- "w",
- "ard"
- ],
- [
- "n",
- "ce"
- ],
- [
- "Ġthe",
- "n"
- ],
- [
- "çª",
- "ģ"
- ],
- [
- "å¥",
- "ĩ"
- ],
- [
- "Ġb",
- "lo"
- ],
- [
- "a",
- "i"
- ],
- [
- "æŀ",
- "Ĺ"
- ],
- [
- "ç®Ĺ",
- "æ³ķ"
- ],
- [
- "ç»",
- "¼"
- ],
- [
- "Ġpr",
- "int"
- ],
- [
- "ac",
- "es"
- ],
- [
- "l",
- "u"
- ],
- [
- "ª",
- "æĸ½"
- ],
- [
- "p",
- "re"
- ],
- [
- "çļĦæĦ",
- "ı"
- ],
- [
- "Ġs",
- "ol"
- ],
- [
- "Ġover",
- "all"
- ],
- [
- "h",
- "old"
- ],
- [
- "Ġ",
- "es"
- ],
- [
- "çļĦ",
- "ä¸Ģ"
- ],
- [
- "éģ",
- "ĩ"
- ],
- [
- "Ġpop",
- "ul"
- ],
- [
- "å°ı",
- "说"
- ],
- [
- "æ³",
- "¢"
- ],
- [
- "åį",
- "ģ"
- ],
- [
- "ä¹Ł",
- "åı¯ä»¥"
- ],
- [
- "é£Ł",
- "åĵģ"
- ],
- [
- "Ġcont",
- "ent"
- ],
- [
- "å°",
- "Ħ"
- ],
- [
- "Ġrequ",
- "ires"
- ],
- [
- "æ£Ģ",
- "æŁ¥"
- ],
- [
- "ĊĠĠĠĠĠĠĠĠ",
- "ĠĠĠ"
- ],
- [
- "Ġgrou",
- "ps"
- ],
- [
- "Ġf",
- "air"
- ],
- [
- "Ġb",
- "l"
- ],
- [
- "å®ŀ",
- "éªĮ"
- ],
- [
- "æĮī",
- "çħ§"
- ],
- [
- "os",
- "p"
- ],
- [
- "st",
- "r"
- ],
- [
- "ä¸į",
- "èĥ½"
- ],
- [
- "Ġh",
- "arm"
- ],
- [
- "Ġpro",
- "du"
- ],
- [
- "çļĦæĬ",
- "Ģ"
- ],
- [
- "ç",
- "ĩ"
- ],
- [
- "t",
- "le"
- ],
- [
- "Ġanim",
- "als"
- ],
- [
- "è§Ĵ",
- "èī²"
- ],
- [
- "le",
- "v"
- ],
- [
- "æ¸",
- "IJ"
- ],
- [
- "å¤į",
- "æĿĤ"
- ],
- [
- "Ġde",
- "pend"
- ],
- [
- "æĮ",
- "ijæĪĺ"
- ],
- [
- "åĮħ",
- "åIJ«"
- ],
- [
- "Ġhelp",
- "s"
- ],
- [
- "Ġop",
- "en"
- ],
- [
- "Ġn",
- "et"
- ],
- [
- "ĠĠĠĠ",
- "Ġ"
- ],
- [
- "Ġstr",
- "ong"
- ],
- [
- "Ġj",
- "our"
- ],
- [
- "广",
- "æ³Ľ"
- ],
- [
- "æķ´",
- "个"
- ],
- [
- "Ġe",
- "lect"
- ],
- [
- "Ġrespon",
- "se"
- ],
- [
- "åįķ",
- "è¯į"
- ],
- [
- "æľ",
- "ĭ"
- ],
- [
- "Ġ",
- "<"
- ],
- [
- "åĮĸ",
- "åѦ"
- ],
- [
- "éĴ",
- "Ī"
- ],
- [
- "Ġqu",
- "ick"
- ],
- [
- "ual",
- "ly"
- ],
- [
- "Ġsom",
- "ething"
- ],
- [
- "Ġtra",
- "ck"
- ],
- [
- "度",
- "åĴĮ"
- ],
- [
- "eren",
- "ces"
- ],
- [
- "æł",
- "ij"
- ],
- [
- "Ġacc",
- "uracy"
- ],
- [
- "Ġex",
- "c"
- ],
- [
- "é£",
- "ŀ"
- ],
- [
- "Ġfiel",
- "d"
- ],
- [
- "寻",
- "æī¾"
- ],
- [
- "éħ",
- "¸"
- ],
- [
- "Ġh",
- "ope"
- ],
- [
- "ç",
- "ij"
- ],
- [
- "Ġin",
- "nov"
- ],
- [
- "ç»",
- "ª"
- ],
- [
- "al",
- "k"
- ],
- [
- "Ġtyp",
- "es"
- ],
- [
- "Ġd",
- "id"
- ],
- [
- "åĬ",
- "ª"
- ],
- [
- "Ġc",
- "all"
- ],
- [
- "è¯",
- "Ĺ"
- ],
- [
- "Ġear",
- "ly"
- ],
- [
- "ĠO",
- "ne"
- ],
- [
- "a",
- "pp"
- ],
- [
- "Ġcomm",
- "on"
- ],
- [
- "æľĢ",
- "ç»Ī"
- ],
- [
- "Ġche",
- "ck"
- ],
- [
- "Ġs",
- "ym"
- ],
- [
- "çĤ",
- "Ĵ"
- ],
- [
- "æĬĢ",
- "èĥ½"
- ],
- [
- "Ġen",
- "h"
- ],
- [
- "Ġag",
- "ricult"
- ],
- [
- "Ġim",
- "m"
- ],
- [
- "ç»",
- "ĩ"
- ],
- [
- "满",
- "è¶³"
- ],
- [
- "Ġs",
- "chool"
- ],
- [
- "b",
- "al"
- ],
- [
- "Ġfollow",
- "ing"
- ],
- [
- "b",
- "ased"
- ],
- [
- "Ġwe",
- "bs"
- ],
- [
- "Ġcult",
- "ure"
- ],
- [
- "ĠC",
- "om"
- ],
- [
- "w",
- "ay"
- ],
- [
- "ä¸Ģ",
- "å®ļ"
- ],
- [
- "åķĨ",
- "åĵģ"
- ],
- [
- "ud",
- "e"
- ],
- [
- "çļĦ",
- "åıijå±ķ"
- ],
- [
- "çĶŁ",
- "产"
- ],
- [
- "os",
- "ystem"
- ],
- [
- "Ġpl",
- "ant"
- ],
- [
- "åı",
- "¶"
- ],
- [
- "åIJ",
- "ĥ"
- ],
- [
- "ä»ĸ",
- "çļĦ"
- ],
- [
- "d",
- "er"
- ],
- [
- "è¯",
- "¢"
- ],
- [
- "å®¶",
- "åħ·"
- ],
- [
- "Ġf",
- "ree"
- ],
- [
- "ç§",
- "»"
- ],
- [
- "æİ",
- "Į"
- ],
- [
- "Ġb",
- "ody"
- ],
- [
- "Ġp",
- "resent"
- ],
- [
- "Ġpartic",
- "ularly"
- ],
- [
- "Ġchild",
- "ren"
- ],
- [
- "Ġstud",
- "ent"
- ],
- [
- ")",
- "."
- ],
- [
- "çī¹",
- "å¾ģ"
- ],
- [
- "è",
- "Ķ"
- ],
- [
- "éĺħ",
- "读"
- ],
- [
- "æķĪ",
- "çİĩ"
- ],
- [
- "Ġprogra",
- "m"
- ],
- [
- "éħ",
- "±"
- ],
- [
- "åıĺ",
- "å¾Ĺ"
- ],
- [
- "i",
- "x"
- ],
- [
- "Ġcom",
- "e"
- ],
- [
- "çļĦæ",
- "²"
- ],
- [
- "ĠT",
- "e"
- ],
- [
- "ĠT",
- "o"
- ],
- [
- "åħ±",
- "åIJĮ"
- ],
- [
- "Ġemploy",
- "ees"
- ],
- [
- "说",
- "æĺİ"
- ],
- [
- "Ġhe",
- "art"
- ],
- [
- "Ġm",
- "ot"
- ],
- [
- "æľĭ",
- "åıĭ"
- ],
- [
- "er",
- "ic"
- ],
- [
- "è¯",
- "ij"
- ],
- [
- "Ġcur",
- "rent"
- ],
- [
- "æĪIJ",
- "æľ¬"
- ],
- [
- "Ġto",
- "o"
- ],
- [
- "çİ©",
- "å®¶"
- ],
- [
- "åĪĽ",
- "æĸ°"
- ],
- [
- "Ġec",
- "osystem"
- ],
- [
- "常",
- "è§ģ"
- ],
- [
- "ä¸Ģ",
- "æŃ¥"
- ],
- [
- "Ġp",
- "res"
- ],
- [
- "Ġmult",
- "i"
- ],
- [
- "åijĬ",
- "è¯ī"
- ],
- [
- "ä¸",
- "¥"
- ],
- [
- "Ġm",
- "it"
- ],
- [
- "Ġact",
- "ion"
- ],
- [
- "çĨ",
- "Ł"
- ],
- [
- "Ġhab",
- "it"
- ],
- [
- "åı£",
- "æĦŁ"
- ],
- [
- "ç®",
- "±"
- ],
- [
- "Ġus",
- "es"
- ],
- [
- "å¢ŀ",
- "强"
- ],
- [
- "ç»Ļ",
- "åĩº"
- ],
- [
- "Ġ",
- "9"
- ],
- [
- "Ġde",
- "p"
- ],
- [
- "Ġeconom",
- "ic"
- ],
- [
- "æĢ§",
- "çļĦ"
- ],
- [
- "1",
- "8"
- ],
- [
- "åĨ",
- "°"
- ],
- [
- "Ġhelp",
- "ed"
- ],
- [
- "åIJ¸",
- "å¼ķ"
- ],
- [
- "çİ",
- "ĭ"
- ],
- [
- "Ġdiagn",
- "os"
- ],
- [
- "å",
- "ł"
- ],
- [
- "èģĶ",
- "ç³»"
- ],
- [
- "ç¾",
- "¤"
- ],
- [
- "ç»ĥ",
- "ä¹ł"
- ],
- [
- "æĪIJ",
- "éķ¿"
- ],
- [
- "Ġpo",
- "int"
- ],
- [
- "å®ļ",
- "æľŁ"
- ],
- [
- "åij",
- "¼"
- ],
- [
- "èį",
- "¯"
- ],
- [
- "æĿ",
- "¯"
- ],
- [
- "æ¤",
- "Ĵ"
- ],
- [
- "æķĪ",
- "æŀľ"
- ],
- [
- "Ġspec",
- "ial"
- ],
- [
- "æ·",
- "·"
- ],
- [
- "åĩł",
- "个"
- ],
- [
- "aus",
- "e"
- ],
- [
- "é",
- "Ĩ"
- ],
- [
- "æ¯Ķ",
- "èµĽ"
- ],
- [
- "è·",
- "Ŀ"
- ],
- [
- "W",
- "hat"
- ],
- [
- "Ġt",
- "imes"
- ],
- [
- "ic",
- "les"
- ],
- [
- "Ġ",
- "*"
- ],
- [
- "ç´",
- "§"
- ],
- [
- "å¦Ĥæŀľ",
- "ä½ł"
- ],
- [
- "çĭ¬",
- "çī¹"
- ],
- [
- "çģ",
- "µ"
- ],
- [
- "ç¨",
- "İ"
- ],
- [
- "Ġcar",
- "bon"
- ],
- [
- "Ġbi",
- "as"
- ],
- [
- "åĬ©",
- "äºİ"
- ],
- [
- "Ġcon",
- "st"
- ],
- [
- "èĩª",
- "çͱ"
- ],
- [
- "æĿ¥",
- "说"
- ],
- [
- "å°±",
- "æĺ¯"
- ],
- [
- "åį",
- "°"
- ],
- [
- "Ġme",
- "et"
- ],
- [
- "è§Ħ",
- "åĪĴ"
- ],
- [
- "çļĦç",
- "¾"
- ],
- [
- "èIJ¥",
- "åħ»"
- ],
- [
- "at",
- "ors"
- ],
- [
- "稳",
- "å®ļ"
- ],
- [
- "od",
- "e"
- ],
- [
- "çħ",
- "®"
- ],
- [
- "Ġass",
- "oci"
- ],
- [
- "å¿",
- "Ĺ"
- ],
- [
- "è¡Į",
- "æĺŁ"
- ],
- [
- "æĿ",
- "İ"
- ],
- [
- "Ġrev",
- "iew"
- ],
- [
- "åĩ",
- "Ģ"
- ],
- [
- "ĠR",
- "o"
- ],
- [
- "Ġknow",
- "ledge"
- ],
- [
- "以",
- "便"
- ],
- [
- "æµĭ",
- "è¯ķ"
- ],
- [
- "åIJĪ",
- "éĢĤ"
- ],
- [
- "s",
- "c"
- ],
- [
- "å½¢",
- "å¼ı"
- ],
- [
- "Ġfriend",
- "s"
- ],
- [
- "Ġnat",
- "ure"
- ],
- [
- "Ġcrit",
- "ical"
- ],
- [
- "æ´",
- "ĭ"
- ],
- [
- "Ġa",
- "fter"
- ],
- [
- "er",
- "ve"
- ],
- [
- "Ġre",
- "ce"
- ],
- [
- "çļĦæ",
- "Ń"
- ],
- [
- "æ±½",
- "车"
- ],
- [
- "çķ",
- "Į"
- ],
- [
- "Ġlo",
- "ss"
- ],
- [
- "Ġapp",
- "lications"
- ],
- [
- "å¤ļ",
- "ç§į"
- ],
- [
- "éĶ",
- "ħ"
- ],
- [
- "ä¸",
- "²"
- ],
- [
- "Ġins",
- "p"
- ],
- [
- "--",
- "-"
- ],
- [
- "ĠS",
- "h"
- ],
- [
- "Ġv",
- "ol"
- ],
- [
- "l",
- "ut"
- ],
- [
- "o",
- "ks"
- ],
- [
- "se",
- "qu"
- ],
- [
- "Ġb",
- "ir"
- ],
- [
- "åIJĪ",
- "çIJĨ"
- ],
- [
- "Ġne",
- "cess"
- ],
- [
- "æĪij",
- "æĥ³"
- ],
- [
- "çŃī",
- "æĸ¹éĿ¢"
- ],
- [
- "é¼",
- "ĵ"
- ],
- [
- "Ġso",
- "ft"
- ],
- [
- "Ġl",
- "ive"
- ],
- [
- "å°ı",
- "æĺİ"
- ],
- [
- "ĠI",
- "nd"
- ],
- [
- "Ġbr",
- "ing"
- ],
- [
- "æĺ¯",
- "æĮĩ"
- ],
- [
- "Ġso",
- "il"
- ],
- [
- "il",
- "ar"
- ],
- [
- "ä¸",
- "ľ"
- ],
- [
- "æĿ¡",
- "ä»¶"
- ],
- [
- "Ġt",
- "ri"
- ],
- [
- "äº",
- "®"
- ],
- [
- "Ġm",
- "om"
- ],
- [
- "æı",
- "¡"
- ],
- [
- "ä¼",
- "°"
- ],
- [
- "ŀ",
- "äºī"
- ],
- [
- "çĽ",
- "ij"
- ],
- [
- "èĤ",
- "¤"
- ],
- [
- "è´¢",
- "åĬ¡"
- ],
- [
- "æ·»",
- "åĬł"
- ],
- [
- "饮",
- "é£Ł"
- ],
- [
- "Ġallow",
- "ing"
- ],
- [
- "åº",
- "ķ"
- ],
- [
- "Ġr",
- "ight"
- ],
- [
- "Ġexp",
- "ert"
- ],
- [
- "Ġsu",
- "pp"
- ],
- [
- "Ġin",
- "it"
- ],
- [
- "çļĦæ",
- "µ"
- ],
- [
- "ar",
- "get"
- ],
- [
- "Ġexp",
- "ect"
- ],
- [
- "Ġ1",
- "9"
- ],
- [
- "Ġmeas",
- "ures"
- ],
- [
- "olut",
- "ions"
- ],
- [
- "j",
- "ust"
- ],
- [
- "ar",
- "c"
- ],
- [
- "å°",
- "ļ"
- ],
- [
- "Ġpract",
- "ice"
- ],
- [
- "æľī",
- "åĬ©äºİ"
- ],
- [
- "大",
- "éĩı"
- ],
- [
- "'",
- ","
- ],
- [
- "im",
- "ent"
- ],
- [
- "Ġcontin",
- "ue"
- ],
- [
- "Ġdisc",
- "uss"
- ],
- [
- "1",
- "00"
- ],
- [
- "éļ",
- "ľ"
- ],
- [
- "çļĦæĦ",
- "Ł"
- ],
- [
- "Ġref",
- "lect"
- ],
- [
- "it",
- "ation"
- ],
- [
- "åį",
- "«"
- ],
- [
- "äºĨ",
- "ä¸Ģ"
- ],
- [
- "ne",
- "y"
- ],
- [
- "ĠL",
- "e"
- ],
- [
- "is",
- "ed"
- ],
- [
- "è¶",
- "ĭ"
- ],
- [
- "äºĨ",
- "ä¸Ģ个"
- ],
- [
- "Ġincre",
- "asing"
- ],
- [
- "çļĦæ",
- "Į"
- ],
- [
- "Ġst",
- "ru"
- ],
- [
- "æĢ»",
- "ç»ĵ"
- ],
- [
- "e",
- "ly"
- ],
- [
- "å®",
- "ĩ"
- ],
- [
- "Ġaut",
- "hor"
- ],
- [
- "表",
- "éĿ¢"
- ],
- [
- "Ġ",
- "x"
- ],
- [
- "æķħ",
- "äºĭ"
- ],
- [
- "em",
- "ic"
- ],
- [
- "Ġrep",
- "resent"
- ],
- [
- "g",
- "er"
- ],
- [
- "Ġincre",
- "ased"
- ],
- [
- "on",
- "es"
- ],
- [
- "ain",
- "s"
- ],
- [
- "Ġtrain",
- "ed"
- ],
- [
- "Ġf",
- "ish"
- ],
- [
- "Ġst",
- "ate"
- ],
- [
- "åĨ",
- "·"
- ],
- [
- "çĶŁ",
- "éķ¿"
- ],
- [
- "Ġre",
- "new"
- ],
- [
- "ord",
- "ing"
- ],
- [
- "åĮ",
- "Ĺ"
- ],
- [
- "æİ",
- "ªæĸ½"
- ],
- [
- "å¹³",
- "è¡¡"
- ],
- [
- "Ġsuccess",
- "ful"
- ],
- [
- "ä¸ĭ",
- "éĿ¢"
- ],
- [
- "Ġactiv",
- "ity"
- ],
- [
- "èĮ",
- "¶"
- ],
- [
- "éĢĤ",
- "åºĶ"
- ],
- [
- "èĦ",
- "ij"
- ],
- [
- "æİ¢",
- "ç´¢"
- ],
- [
- "ff",
- "ic"
- ],
- [
- "ç»Ħ",
- "æĪIJ"
- ],
- [
- "at",
- "ives"
- ],
- [
- "äº",
- "ļ"
- ],
- [
- "Ġsc",
- "en"
- ],
- [
- "æ²",
- "Ļ"
- ],
- [
- "g",
- "ress"
- ],
- [
- "使",
- "å¾Ĺ"
- ],
- [
- "æī",
- "¿"
- ],
- [
- "Ġdisc",
- "rim"
- ],
- [
- "Ġassist",
- "ants"
- ],
- [
- "Ġex",
- "ist"
- ],
- [
- "çķ",
- "Ļ"
- ],
- [
- "Ġsp",
- "ace"
- ],
- [
- "æľĢ",
- "è¿ij"
- ],
- [
- "Ġide",
- "as"
- ],
- [
- "éĩĩ",
- "åıĸ"
- ],
- [
- "l",
- "ight"
- ],
- [
- "注",
- "éĩį"
- ],
- [
- "çļĦæĹ¶",
- "éĹ´"
- ],
- [
- "è¿",
- "İ"
- ],
- [
- "Ġcom",
- "b"
- ],
- [
- "éĢĤ",
- "å½ĵ"
- ],
- [
- "Ġyour",
- "self"
- ],
- [
- "rit",
- "e"
- ],
- [
- "as",
- "on"
- ],
- [
- "åĮ",
- "Ģ"
- ],
- [
- "åı¯ä»¥",
- "使ç͍"
- ],
- [
- "åħħ",
- "满"
- ],
- [
- "Ġval",
- "ues"
- ],
- [
- "æ",
- "½"
- ],
- [
- "Ġbi",
- "ases"
- ],
- [
- "ä¿ĥ",
- "è¿Ľ"
- ],
- [
- "åľº",
- "æĻ¯"
- ],
- [
- "ro",
- "ss"
- ],
- [
- "åį³",
- "åı¯"
- ],
- [
- "Ġc",
- "ru"
- ],
- [
- "Ġnum",
- "ber"
- ],
- [
- "Ġty",
- "pe"
- ],
- [
- "r",
- "ast"
- ],
- [
- "åĩĨ",
- "ç¡®"
- ],
- [
- "Th",
- "is"
- ],
- [
- "Ġp",
- "ast"
- ],
- [
- "çģ",
- "¯"
- ],
- [
- "å®ļ",
- "ä¹ī"
- ],
- [
- "Ġs",
- "olutions"
- ],
- [
- "Ġt",
- "er"
- ],
- [
- "ä¿Ŀ",
- "è¯ģ"
- ],
- [
- "èĶ",
- "¬"
- ],
- [
- "å¹",
- "¸"
- ],
- [
- "åī",
- "§"
- ],
- [
- "åħ´",
- "è¶£"
- ],
- [
- "å",
- "ª"
- ],
- [
- "ent",
- "ion"
- ],
- [
- "av",
- "or"
- ],
- [
- "Ġsc",
- "ient"
- ],
- [
- "åĬª",
- "åĬĽ"
- ],
- [
- "Ġprovid",
- "ers"
- ],
- [
- "Ġpolic",
- "ies"
- ],
- [
- "al",
- "u"
- ],
- [
- "ĠI",
- "m"
- ],
- [
- "Ġallow",
- "s"
- ],
- [
- "Ġintellig",
- "ence"
- ],
- [
- "çļĦ",
- "æĸ¹æ³ķ"
- ],
- [
- "è¿Ļ",
- "æĺ¯"
- ],
- [
- "Ġ",
- "`"
- ],
- [
- "Ġem",
- "issions"
- ],
- [
- "Ġ",
- "å°Ĩ"
- ],
- [
- "Ġmean",
- "ing"
- ],
- [
- "Ġst",
- "yle"
- ],
- [
- "åİŁ",
- "åĽł"
- ],
- [
- "Ġstru",
- "gg"
- ],
- [
- "çļĦç¾",
- "İ"
- ],
- [
- "if",
- "ul"
- ],
- [
- "dit",
- "ion"
- ],
- [
- "éĥ½",
- "æľī"
- ],
- [
- "空",
- "æ°Ķ"
- ],
- [
- "å®ĥ们",
- "çļĦ"
- ],
- [
- "ä¼ĺ",
- "åĮĸ"
- ],
- [
- "Ġinf",
- "lu"
- ],
- [
- "åŁº",
- "äºİ"
- ],
- [
- "Ġdetail",
- "s"
- ],
- [
- "Ġtranspare",
- "ncy"
- ],
- [
- "Ġm",
- "ess"
- ],
- [
- "ĠC",
- "l"
- ],
- [
- "Ġg",
- "ame"
- ],
- [
- "p",
- "ri"
- ],
- [
- "è¶ĭ",
- "åĬ¿"
- ],
- [
- "å½",
- "Ĵ"
- ],
- [
- "ç¿»",
- "è¯ij"
- ],
- [
- "æķ",
- "£"
- ],
- [
- "B",
- "y"
- ],
- [
- "é",
- "Ń"
- ],
- [
- "ĠAm",
- "eric"
- ],
- [
- "Ġproduct",
- "ion"
- ],
- [
- "Ġinc",
- "orpor"
- ],
- [
- "æĻ",
- "ļ"
- ],
- [
- "Ġinvol",
- "ve"
- ],
- [
- "Ġh",
- "ot"
- ],
- [
- "æĻ",
- "®"
- ],
- [
- "b",
- "y"
- ],
- [
- "Ġf",
- "low"
- ],
- [
- "Ġem",
- "erg"
- ],
- [
- "åº",
- "§"
- ],
- [
- "Ġide",
- "a"
- ],
- [
- "åİĭ",
- "åĬĽ"
- ],
- [
- "éĿ",
- "Ĵ"
- ],
- [
- "om",
- "s"
- ],
- [
- "èģĮ",
- "ä¸ļ"
- ],
- [
- "Ġre",
- "port"
- ],
- [
- "Ġp",
- "ap"
- ],
- [
- "Ġthe",
- "rap"
- ],
- [
- "Ġs",
- "al"
- ],
- [
- "åıĤ",
- "ä¸İ"
- ],
- [
- "æĸĩ",
- "åѦ"
- ],
- [
- "æIJŃ",
- "éħį"
- ],
- [
- "o",
- "ot"
- ],
- [
- ")",
- ","
- ],
- [
- "Ġc",
- "r"
- ],
- [
- "Ġprocess",
- "es"
- ],
- [
- "g",
- "in"
- ],
- [
- "å¹³",
- "åı°"
- ],
- [
- "å¯",
- "Ł"
- ],
- [
- "Ġpromot",
- "ing"
- ],
- [
- "æļ",
- "ĸ"
- ],
- [
- "ake",
- "hold"
- ],
- [
- "ç»",
- "§"
- ],
- [
- "iv",
- "er"
- ],
- [
- "æ",
- "¦Ĥ"
- ],
- [
- "Ġmodel",
- "s"
- ],
- [
- "Ġd",
- "ra"
- ],
- [
- "è",
- "ĸ"
- ],
- [
- "Ġgrou",
- "p"
- ],
- [
- "è¶³",
- "å¤Ł"
- ],
- [
- "Ġg",
- "reen"
- ],
- [
- "Ġhealth",
- "y"
- ],
- [
- "Ġcom",
- "fort"
- ],
- [
- "Ġad",
- "ditional"
- ],
- [
- "ä¸Ģ",
- "次"
- ],
- [
- "é¤IJ",
- "åİħ"
- ],
- [
- "Ġmaterial",
- "s"
- ],
- [
- "Ġman",
- "age"
- ],
- [
- "çļĦæ",
- "¯"
- ],
- [
- "ä¼",
- "¤"
- ],
- [
- "åıĬ",
- "æĹ¶"
- ],
- [
- "Ġg",
- "lo"
- ],
- [
- "Ġst",
- "at"
- ],
- [
- "å¿«",
- "éĢŁ"
- ],
- [
- "Ġmonitor",
- "ing"
- ],
- [
- "ail",
- "y"
- ],
- [
- "ra",
- "nd"
- ],
- [
- "o",
- "ice"
- ],
- [
- "res",
- "h"
- ],
- [
- "ç»Ħ",
- "ç»ĩ"
- ],
- [
- "Ġund",
- "er"
- ],
- [
- "Ġnecess",
- "ary"
- ],
- [
- "Ġhelp",
- "ful"
- ],
- [
- "ĠC",
- "ol"
- ],
- [
- "é»ij",
- "æ´ŀ"
- ],
- [
- "åģļ",
- "åĩº"
- ],
- [
- "Ġcour",
- "se"
- ],
- [
- "Ġm",
- "at"
- ],
- [
- "Ġle",
- "g"
- ],
- [
- "Ġf",
- "ace"
- ],
- [
- "ä»",
- "¤"
- ],
- [
- "èī¯",
- "好çļĦ"
- ],
- [
- "oc",
- "k"
- ],
- [
- "åĮ»",
- "çĸĹ"
- ],
- [
- "çĽ",
- "ĸ"
- ],
- [
- "id",
- "ence"
- ],
- [
- "Ġassoci",
- "ated"
- ],
- [
- "Ġpro",
- "gress"
- ],
- [
- "åľ",
- "Ĩ"
- ],
- [
- "Ġevery",
- "one"
- ],
- [
- "ç¼",
- "ĵ"
- ],
- [
- "ĠEn",
- "g"
- ],
- [
- "w",
- "ord"
- ],
- [
- "èĵ",
- "Ŀ"
- ],
- [
- "天",
- "æ°Ķ"
- ],
- [
- "Ġact",
- "ions"
- ],
- [
- "em",
- "s"
- ],
- [
- "ĠP",
- "l"
- ],
- [
- "å®",
- "Ļ"
- ],
- [
- "us",
- "h"
- ],
- [
- "é¡",
- "¾"
- ],
- [
- "Ġcost",
- "s"
- ],
- [
- "at",
- "or"
- ],
- [
- "ç©",
- "¿"
- ],
- [
- "Ġamount",
- "s"
- ],
- [
- "èͬ",
- "èıľ"
- ],
- [
- ".",
- "."
- ],
- [
- "Ġman",
- "ner"
- ],
- [
- "Ġcon",
- "sequ"
- ],
- [
- "æ°Ķ",
- "åĢĻ"
- ],
- [
- "Ġins",
- "ights"
- ],
- [
- "be",
- "ing"
- ],
- [
- "at",
- "ory"
- ],
- [
- "en",
- "er"
- ],
- [
- "le",
- "x"
- ],
- [
- "Ġme",
- "ans"
- ],
- [
- "Ġcollabor",
- "ation"
- ],
- [
- "Ġpers",
- "pect"
- ],
- [
- "or",
- "m"
- ],
- [
- "pri",
- "ate"
- ],
- [
- "å°Ĭ",
- "éĩį"
- ],
- [
- "Ġt",
- "arget"
- ],
- [
- "è®°",
- "å½ķ"
- ],
- [
- "åĢ",
- "Ĵ"
- ],
- [
- "Ġrenew",
- "able"
- ],
- [
- "æĦ",
- "¿"
- ],
- [
- "èĥ½",
- "æºIJ"
- ],
- [
- "Ġin",
- "put"
- ],
- [
- "å®ĩ",
- "å®Ļ"
- ],
- [
- "a",
- "pe"
- ],
- [
- "Ġad",
- "just"
- ],
- [
- "er",
- "ies"
- ],
- [
- "Ġd",
- "ire"
- ],
- [
- "ä¾",
- "Ŀ"
- ],
- [
- "ust",
- "r"
- ],
- [
- "f",
- "ect"
- ],
- [
- "Ġbeaut",
- "iful"
- ],
- [
- "Ġd",
- "ue"
- ],
- [
- "re",
- "ci"
- ],
- [
- "çĮ",
- "®"
- ],
- [
- "èĥĮ",
- "æĻ¯"
- ],
- [
- "èĤ",
- "¡"
- ],
- [
- "Ġd",
- "am"
- ],
- [
- "i",
- "k"
- ],
- [
- "Ġadv",
- "anced"
- ],
- [
- "缸",
- "对"
- ],
- [
- "åIJį",
- "ç§°"
- ],
- [
- "Ġsh",
- "ort"
- ],
- [
- "Ġob",
- "ject"
- ],
- [
- "è¿Ļ",
- "éĩĮ"
- ],
- [
- "éĢł",
- "æĪIJ"
- ],
- [
- "èIJ¥",
- "éĶĢ"
- ],
- [
- "çļĦæĥħ",
- "æĦŁ"
- ],
- [
- "ç¥",
- "¨"
- ],
- [
- "Ġcount",
- "ries"
- ],
- [
- "in",
- "ing"
- ],
- [
- "ist",
- "ic"
- ],
- [
- "Ġpl",
- "ans"
- ],
- [
- "è´£",
- "ä»»"
- ],
- [
- "Ġst",
- "akehold"
- ],
- [
- "t",
- "he"
- ],
- [
- "Ġass",
- "ess"
- ],
- [
- "æĢĿ",
- "èĢĥ"
- ],
- [
- "e",
- "ch"
- ],
- [
- "æĪIJ",
- "åijĺ"
- ],
- [
- "2",
- "1"
- ],
- [
- "Ġd",
- "aily"
- ],
- [
- "Ġcomp",
- "ut"
- ],
- [
- "çļĦæĥħ",
- "åĨµ"
- ],
- [
- "æıIJ",
- "åĩº"
- ],
- [
- "Ġ",
- "âĢľ"
- ],
- [
- "åª",
- "Ĵ"
- ],
- [
- "ä¸Ń",
- "å¿ĥ"
- ],
- [
- "is",
- "hed"
- ],
- [
- "ĠS",
- "e"
- ],
- [
- "onom",
- "ous"
- ],
- [
- "er",
- "n"
- ],
- [
- "ç»´",
- "æĬ¤"
- ],
- [
- "am",
- "es"
- ],
- [
- "Ġpriorit",
- "ize"
- ],
- [
- "çº",
- "¸"
- ],
- [
- "èĤ",
- "¥"
- ],
- [
- "Ġtem",
- "per"
- ],
- [
- "æ¸ħ",
- "æ´ģ"
- ],
- [
- "us",
- "e"
- ],
- [
- "æ±",
- "¡"
- ],
- [
- "Ġmin",
- "im"
- ],
- [
- "æĺ¯",
- "åľ¨"
- ],
- [
- "大",
- "å°ı"
- ],
- [
- "åĵª",
- "äºĽ"
- ],
- [
- "Ġapp",
- "reci"
- ],
- [
- "ren",
- "g"
- ],
- [
- "Ġregul",
- "ations"
- ],
- [
- "Ġ",
- "Z"
- ],
- [
- "éĶĻ",
- "误"
- ],
- [
- "r",
- "ans"
- ],
- [
- "èĢĮ",
- "ä¸Ķ"
- ],
- [
- "èĪ",
- "¬"
- ],
- [
- "èij",
- "±"
- ],
- [
- "è",
- "Ĩ"
- ],
- [
- "æ°´",
- "å¹³"
- ],
- [
- "è´Ń",
- "çī©"
- ],
- [
- "åŃĹ符",
- "串"
- ],
- [
- "对",
- "æĸ¹"
- ],
- [
- "Ġh",
- "im"
- ],
- [
- "Ġconsequ",
- "ences"
- ],
- [
- "å·",
- "´"
- ],
- [
- "é¼ĵ",
- "åĬ±"
- ],
- [
- "Ġf",
- "il"
- ],
- [
- "人",
- "åijĺ"
- ],
- [
- "è·Ŀ",
- "离"
- ],
- [
- "ĠW",
- "hen"
- ],
- [
- "çļĦæ°",
- "´"
- ],
- [
- "çī©",
- "çIJĨ"
- ],
- [
- "åIJĮæĹ¶",
- "ä¹Ł"
- ],
- [
- "åľ¨",
- "è¿Ļ个"
- ],
- [
- "åħ¶",
- "次"
- ],
- [
- ",",
- "\""
- ],
- [
- "æ¶",
- "²"
- ],
- [
- "çĶ",
- "·"
- ],
- [
- "iv",
- "al"
- ],
- [
- "åı¯ä»¥",
- "让"
- ],
- [
- "æĥ",
- "¯"
- ],
- [
- "Ġadv",
- "ance"
- ],
- [
- "Ġve",
- "h"
- ],
- [
- "å¦Ĥæŀľ",
- "æĤ¨"
- ],
- [
- "Ġest",
- "ab"
- ],
- [
- "ri",
- "pt"
- ],
- [
- "ç«",
- "¯"
- ],
- [
- "ä¸į",
- "ä¼ļ"
- ],
- [
- "Ġtranspare",
- "nt"
- ],
- [
- "æķ°",
- "éĩı"
- ],
- [
- "çĽ",
- "ĺ"
- ],
- [
- "Ġspe",
- "ak"
- ],
- [
- "Ġp",
- "ark"
- ],
- [
- "Ġstakehold",
- "ers"
- ],
- [
- "é",
- "º"
- ],
- [
- "Ġev",
- "ent"
- ],
- [
- "çļĦæķ°",
- "æį®"
- ],
- [
- "èĩª",
- "åĬ¨"
- ],
- [
- "ç»Ĩ",
- "èĬĤ"
- ],
- [
- "è¯Ħ",
- "ä¼°"
- ],
- [
- "æ¶",
- "¦"
- ],
- [
- "Ġpref",
- "erences"
- ],
- [
- "Ġve",
- "get"
- ],
- [
- "æį",
- "Ł"
- ],
- [
- "e",
- "qu"
- ],
- [
- "Ġg",
- "l"
- ],
- [
- "Ġp",
- "ain"
- ],
- [
- "o",
- "gra"
- ],
- [
- "Ġtra",
- "ffic"
- ],
- [
- "Ġo",
- "ce"
- ],
- [
- "ä¹",
- "ĺ"
- ],
- [
- "e",
- "xt"
- ],
- [
- "âĢĿ",
- "ï¼Į"
- ],
- [
- "Ġan",
- "other"
- ],
- [
- "å¤ļ",
- "å°ij"
- ],
- [
- "Ġagain",
- "st"
- ],
- [
- "ç»ı",
- "åİĨ"
- ],
- [
- "计ç®Ĺ",
- "æľº"
- ],
- [
- "èĢ",
- "IJ"
- ],
- [
- "软",
- "ä»¶"
- ],
- [
- "ĠP",
- "re"
- ],
- [
- "Ġpl",
- "ants"
- ],
- [
- "缸",
- "äºĴ"
- ],
- [
- "é¢",
- "ij"
- ],
- [
- "\\",
- "_"
- ],
- [
- "Ġs",
- "ame"
- ],
- [
- "ru",
- "g"
- ],
- [
- "Ġval",
- "u"
- ],
- [
- "Ġo",
- "cc"
- ],
- [
- "çļĦç",
- "¤"
- ],
- [
- "Ġsustain",
- "ability"
- ],
- [
- "ĠS",
- "he"
- ],
- [
- "d",
- "e"
- ],
- [
- "ot",
- "e"
- ],
- [
- "Ġd",
- "ig"
- ],
- [
- "N",
- "A"
- ],
- [
- "Ġcru",
- "cial"
- ],
- [
- "æī",
- "§"
- ],
- [
- "å±",
- "Ģ"
- ],
- [
- "æĭ",
- "Ł"
- ],
- [
- "æĭ",
- "Į"
- ],
- [
- "Ġn",
- "on"
- ],
- [
- "Ġeng",
- "aging"
- ],
- [
- "Ġinter",
- "n"
- ],
- [
- "L",
- "P"
- ],
- [
- "温",
- "度"
- ],
- [
- "æł",
- "¸"
- ],
- [
- "æĬ¥",
- "åijĬ"
- ],
- [
- "æĿ¥",
- "è¶Ĭ"
- ],
- [
- "h",
- "ood"
- ],
- [
- "ä¸ī",
- "个"
- ],
- [
- "å¦Ĥ",
- "ä¸ĭ"
- ],
- [
- "çī©",
- "ä½ĵ"
- ],
- [
- "for",
- "ce"
- ],
- [
- "Ġneed",
- "ed"
- ],
- [
- "Ġim",
- "ages"
- ],
- [
- "Ġbuild",
- "ing"
- ],
- [
- "ici",
- "ous"
- ],
- [
- "Ġ",
- "æĪij"
- ],
- [
- "è¶Ĭ",
- "æĿ¥è¶Ĭ"
- ],
- [
- "æĶ¾",
- "åħ¥"
- ],
- [
- "g",
- "o"
- ],
- [
- "éĻį",
- "ä½İ"
- ],
- [
- "å½ĵ",
- "åľ°"
- ],
- [
- "æ¶Īè´¹",
- "èĢħ"
- ],
- [
- "ç",
- "£"
- ],
- [
- "ivers",
- "ity"
- ],
- [
- "é¢Ħ",
- "ç®Ĺ"
- ],
- [
- "ic",
- "le"
- ],
- [
- "æ··",
- "åIJĪ"
- ],
- [
- "Ġpartic",
- "ip"
- ],
- [
- "Ġdis",
- "hes"
- ],
- [
- "Ġthrough",
- "out"
- ],
- [
- "Ġwith",
- "in"
- ],
- [
- "åı",
- "³"
- ],
- [
- "é«ĺ",
- "çļĦ"
- ],
- [
- "Ġph",
- "ot"
- ],
- [
- "Ġtr",
- "ust"
- ],
- [
- "æĦı",
- "è¯Ĩ"
- ],
- [
- "以",
- "ç¡®ä¿Ŀ"
- ],
- [
- "çĬ¶",
- "æĢģ"
- ],
- [
- "Ġautom",
- "ation"
- ],
- [
- "1",
- "1"
- ],
- [
- "Ġpo",
- "st"
- ],
- [
- "æīĭ",
- "æľº"
- ],
- [
- "wor",
- "ks"
- ],
- [
- "éĢ",
- "ı"
- ],
- [
- "åº",
- "ĵ"
- ],
- [
- "Ġw",
- "ind"
- ],
- [
- "Ġ=",
- "="
- ],
- [
- "Ġprocess",
- "ing"
- ],
- [
- "èĮĥ",
- "åĽ´"
- ],
- [
- "æĦı",
- "ä¹ī"
- ],
- [
- "追",
- "æ±Ĥ"
- ],
- [
- "Ã",
- "©"
- ],
- [
- "å¾",
- "Ħ"
- ],
- [
- "éĿ",
- "ł"
- ],
- [
- "ä¸",
- "ĸ"
- ],
- [
- "èĻ",
- "½"
- ],
- [
- "ç«",
- "ŀäºī"
- ],
- [
- "Ġappro",
- "priate"
- ],
- [
- "æĽ´",
- "好çļĦ"
- ],
- [
- "Ġcharact",
- "er"
- ],
- [
- "c",
- "l"
- ],
- [
- "ç§",
- "ĺ"
- ],
- [
- "it",
- "ude"
- ],
- [
- "Ġte",
- "ac"
- ],
- [
- "le",
- "ep"
- ],
- [
- "ĠDe",
- "velop"
- ],
- [
- "in",
- "ce"
- ],
- [
- "å·",
- "¦"
- ],
- [
- "g",
- "round"
- ],
- [
- "è¡Į",
- "ä¸ļ"
- ],
- [
- "éĴĪ",
- "对"
- ],
- [
- "å¿ħ",
- "è¦ģ"
- ],
- [
- "Ġdet",
- "erm"
- ],
- [
- "--------",
- "--------"
- ],
- [
- "Ġst",
- "reng"
- ],
- [
- "d",
- "o"
- ],
- [
- "Ġchalleng",
- "ing"
- ],
- [
- "or",
- "k"
- ],
- [
- "Ġan",
- "x"
- ],
- [
- "èī²",
- "çļĦ"
- ],
- [
- "Ġh",
- "ard"
- ],
- [
- "æĺİ",
- "ç¡®"
- ],
- [
- "åĪĨ",
- "享"
- ],
- [
- "æĶ¹",
- "åıĺ"
- ],
- [
- "ä½",
- "³"
- ],
- [
- "åıª",
- "æľī"
- ],
- [
- "å±ķ",
- "示"
- ],
- [
- "Ġcam",
- "p"
- ],
- [
- "çº",
- "³"
- ],
- [
- "a",
- "j"
- ],
- [
- "et",
- "ic"
- ],
- [
- "u",
- "ment"
- ],
- [
- "ä½ł",
- "åı¯ä»¥"
- ],
- [
- "Ġpol",
- "lut"
- ],
- [
- "Ġh",
- "ig"
- ],
- [
- "pp",
- "ing"
- ],
- [
- "e",
- "ad"
- ],
- [
- "çĦ¶",
- "èĢĮ"
- ],
- [
- "第",
- "äºĮ"
- ],
- [
- "é¸",
- "Ł"
- ],
- [
- "çī©",
- "åĵģ"
- ],
- [
- "ä¸",
- "¾"
- ],
- [
- "Ġencoura",
- "ge"
- ],
- [
- "pe",
- "cial"
- ],
- [
- "Ġac",
- "ross"
- ],
- [
- "el",
- "ves"
- ],
- [
- "äºĭ",
- "ä»¶"
- ],
- [
- "c",
- "le"
- ],
- [
- "æ",
- "©"
- ],
- [
- "åªĴ",
- "ä½ĵ"
- ],
- [
- "n",
- "ers"
- ],
- [
- "Ġc",
- "al"
- ],
- [
- "èϽ",
- "çĦ¶"
- ],
- [
- "åĽ",
- "º"
- ],
- [
- "ä¹ł",
- "æĥ¯"
- ],
- [
- "Ġsaf",
- "e"
- ],
- [
- "èĥ½",
- "éĩı"
- ],
- [
- "ist",
- "ics"
- ],
- [
- "ä¹ĭ",
- "åīį"
- ],
- [
- "Ġiss",
- "ue"
- ],
- [
- "å¤ļ",
- "个"
- ],
- [
- "åĨ³",
- "çŃĸ"
- ],
- [
- "è¾¾",
- "åΰ"
- ],
- [
- "æĹ",
- "©"
- ],
- [
- "ä¸į",
- "åı¯"
- ],
- [
- "ä¸Ģ",
- "缴"
- ],
- [
- "å·",
- "¨"
- ],
- [
- "æĦŁ",
- "è°¢"
- ],
- [
- "ĠN",
- "ew"
- ],
- [
- "ä¸Ģ",
- "段"
- ],
- [
- "Ġmach",
- "ines"
- ],
- [
- "å°Ĩ",
- "åħ¶"
- ],
- [
- "ç»§",
- "ç»Ń"
- ],
- [
- "Ġwor",
- "d"
- ],
- [
- "çī¹",
- "åĪ«"
- ],
- [
- "Ġagricult",
- "ure"
- ],
- [
- "æĢ",
- "İ"
- ],
- [
- "éĢIJ",
- "æ¸IJ"
- ],
- [
- "éĵ",
- "¾"
- ],
- [
- "è¯",
- "¾"
- ],
- [
- "Ġk",
- "ind"
- ],
- [
- "å¢",
- "Ļ"
- ],
- [
- "è°¢",
- "è°¢"
- ],
- [
- "Ġalgorith",
- "m"
- ],
- [
- "è£ħ",
- "饰"
- ],
- [
- "Ġal",
- "ong"
- ],
- [
- "Ġeas",
- "y"
- ],
- [
- "äº",
- "ij"
- ],
- [
- "è§£åĨ³",
- "æĸ¹æ¡Ī"
- ],
- [
- "Ġaware",
- "ness"
- ],
- [
- "'",
- "ve"
- ],
- [
- "æĸ¹",
- "åIJij"
- ],
- [
- "Ġne",
- "ver"
- ],
- [
- "Ġquick",
- "ly"
- ],
- [
- "Ġres",
- "pect"
- ],
- [
- "çļĦæ",
- "Ļ"
- ],
- [
- "Ġam",
- "ong"
- ],
- [
- "Ġaccount",
- "ability"
- ],
- [
- "Ġl",
- "aw"
- ],
- [
- "en",
- "ing"
- ],
- [
- "Ġdef",
- "in"
- ],
- [
- "Ġsur",
- "round"
- ],
- [
- "éĵ",
- "ģ"
- ],
- [
- "Ġpower",
- "ful"
- ],
- [
- "A",
- "n"
- ],
- [
- "Ġcaus",
- "e"
- ],
- [
- "æ",
- "¥"
- ],
- [
- "æİĮ",
- "æı¡"
- ],
- [
- "è¿ĺ",
- "æĺ¯"
- ],
- [
- "Ġcreat",
- "ive"
- ],
- [
- "è¡",
- "Ģ"
- ],
- [
- "Ġloc",
- "ated"
- ],
- [
- "un",
- "ning"
- ],
- [
- "åľ°",
- "åĮº"
- ],
- [
- "éĿ¢",
- "积"
- ],
- [
- "éĽ",
- "¨"
- ],
- [
- "Ġne",
- "ar"
- ],
- [
- "Ġinit",
- "i"
- ],
- [
- "ress",
- "ion"
- ],
- [
- "ä¸ĭ",
- "æĿ¥"
- ],
- [
- "2",
- "5"
- ],
- [
- "é©",
- "¶"
- ],
- [
- "¾",
- "çĹħ"
- ],
- [
- "ab",
- "les"
- ],
- [
- "æľī",
- "è¶£"
- ],
- [
- "循",
- "çݯ"
- ],
- [
- "çŃĶ",
- "æ¡Ī"
- ],
- [
- "çł",
- "´"
- ],
- [
- "ic",
- "ation"
- ],
- [
- "éĻ",
- "¢"
- ],
- [
- "æ²»",
- "çĸĹ"
- ],
- [
- "Ġad",
- "dition"
- ],
- [
- "äºĭ",
- "æĥħ"
- ],
- [
- "Ġbec",
- "ause"
- ],
- [
- "åı",
- "Ī"
- ],
- [
- "èĤ",
- "Į"
- ],
- [
- "çº",
- "ª"
- ],
- [
- "s",
- "ide"
- ],
- [
- "æĭ",
- "ħ"
- ],
- [
- "æ¹",
- "¿"
- ],
- [
- "åį",
- "Ĭ"
- ],
- [
- "é¡",
- "º"
- ],
- [
- "ĠA",
- "nd"
- ],
- [
- "Ġrestaur",
- "ant"
- ],
- [
- "Ġv",
- "ide"
- ],
- [
- "Ġproble",
- "m"
- ],
- [
- "az",
- "ing"
- ],
- [
- "Ġmem",
- "bers"
- ],
- [
- "Ġn",
- "ut"
- ],
- [
- "Ġc",
- "ou"
- ],
- [
- "æµ",
- "ª"
- ],
- [
- "Ġ",
- "è¿Ļ"
- ],
- [
- "Ġhelp",
- "ing"
- ],
- [
- "ĠI",
- "s"
- ],
- [
- "æıIJ",
- "åįĩ"
- ],
- [
- "ĠĠĠĠ",
- "ĠĠ"
- ],
- [
- "Ġsh",
- "o"
- ],
- [
- "Ġre",
- "lev"
- ],
- [
- "Ġar",
- "g"
- ],
- [
- "Ġbal",
- "ance"
- ],
- [
- "ill",
- "ed"
- ],
- [
- "æĺ¯",
- "ä»Ģä¹Ī"
- ],
- [
- "åĬĽ",
- "éĩı"
- ],
- [
- "ire",
- "d"
- ],
- [
- "å¤",
- "ľ"
- ],
- [
- "åı¯",
- "æĮģç»Ń"
- ],
- [
- "Ġper",
- "fect"
- ],
- [
- "*",
- "*"
- ],
- [
- "ific",
- "ation"
- ],
- [
- "æ¶",
- "ī"
- ],
- [
- "Ġwild",
- "life"
- ],
- [
- "an",
- "e"
- ],
- [
- "Ġrel",
- "ated"
- ],
- [
- "室",
- "åĨħ"
- ],
- [
- "åº",
- "ľ"
- ],
- [
- "享",
- "åıĹ"
- ],
- [
- "our",
- "s"
- ],
- [
- "è·",
- "ij"
- ],
- [
- "åķĨ",
- "ä¸ļ"
- ],
- [
- "ach",
- "ing"
- ],
- [
- "Ġsu",
- "n"
- ],
- [
- "Ġrecogn",
- "ition"
- ],
- [
- "el",
- "t"
- ],
- [
- "Ġor",
- "der"
- ],
- [
- "å¹³",
- "åĿĩ"
- ],
- [
- "g",
- "ing"
- ],
- [
- "ä¸",
- "´"
- ],
- [
- "çĤ",
- "¼"
- ],
- [
- "Ġgo",
- "ing"
- ],
- [
- "åij¼",
- "åIJ¸"
- ],
- [
- "Ġsoft",
- "ware"
- ],
- [
- "Ġre",
- "mot"
- ],
- [
- "èijĹ",
- "åIJį"
- ],
- [
- "幸",
- "ç¦ı"
- ],
- [
- "Ġenh",
- "ance"
- ],
- [
- "èĻ",
- "ļ"
- ],
- [
- "Ġn",
- "ow"
- ],
- [
- "Ġth",
- "reat"
- ],
- [
- "Ġd",
- "est"
- ],
- [
- "åĿĩ",
- "åĮĢ"
- ],
- [
- "Ġac",
- "ad"
- ],
- [
- "åºĶ",
- "对"
- ],
- [
- "çľĭ",
- "åΰ"
- ],
- [
- "c",
- "ast"
- ],
- [
- "è¾",
- "Ĩ"
- ],
- [
- "ific",
- "ial"
- ],
- [
- "Ġ",
- "very"
- ],
- [
- "o",
- "ok"
- ],
- [
- "åĮº",
- "åŁŁ"
- ],
- [
- "¹",
- "ģ"
- ],
- [
- "æĪ¿",
- "éĹ´"
- ],
- [
- "æıIJä¾Ľ",
- "äºĨ"
- ],
- [
- "Ġmot",
- "iv"
- ],
- [
- "Ġaccess",
- "ible"
- ],
- [
- "åĨ³",
- "å®ļ"
- ],
- [
- "Ġh",
- "y"
- ],
- [
- "å®",
- "Ī"
- ],
- [
- "Ġf",
- "lo"
- ],
- [
- "u",
- "g"
- ],
- [
- "Ġinform",
- "ed"
- ],
- [
- "åĵģ",
- "è´¨"
- ],
- [
- "çļĦç",
- "Ł"
- ],
- [
- "av",
- "es"
- ],
- [
- "ar",
- "r"
- ],
- [
- "ĠW",
- "ith"
- ],
- [
- "le",
- "t"
- ],
- [
- "è§Ĥ",
- "çĤ¹"
- ],
- [
- "en",
- "ge"
- ],
- [
- "è¡Į",
- "åĬ¨"
- ],
- [
- "f",
- "riend"
- ],
- [
- "ç³",
- "ķ"
- ],
- [
- "Ġf",
- "urther"
- ],
- [
- "ĠE",
- "ns"
- ],
- [
- "ç§",
- "ģ"
- ],
- [
- "Ġad",
- "o"
- ],
- [
- "Ġcle",
- "an"
- ],
- [
- "缸",
- "åºĶ"
- ],
- [
- "Ġf",
- "re"
- ],
- [
- "pecial",
- "ly"
- ],
- [
- "è",
- "Ĺ"
- ],
- [
- "Ġc",
- "apt"
- ],
- [
- "çļĦç",
- "ľ"
- ],
- [
- "Ġsome",
- "one"
- ],
- [
- "Ġc",
- "ell"
- ],
- [
- "æĶ¾",
- "åľ¨"
- ],
- [
- "欢",
- "è¿İ"
- ],
- [
- "Ġ",
- "âĢ"
- ],
- [
- "Ġdev",
- "ices"
- ],
- [
- "çļĦ",
- "æĸ¹å¼ı"
- ],
- [
- "Ġjob",
- "s"
- ],
- [
- "au",
- "gh"
- ],
- [
- "n",
- "ot"
- ],
- [
- "æľī",
- "äºĽ"
- ],
- [
- "åħ¬",
- "åħ±"
- ],
- [
- "g",
- "est"
- ],
- [
- "çļĦ",
- "çĶŁæ´»"
- ],
- [
- "çľ",
- "¼"
- ],
- [
- "çļĦ",
- "ä¿¡æģ¯"
- ],
- [
- "ĠC",
- "ons"
- ],
- [
- "æİĴ",
- "åºı"
- ],
- [
- "Ġbenef",
- "it"
- ],
- [
- "re",
- "ct"
- ],
- [
- "å¤",
- "ı"
- ],
- [
- "un",
- "te"
- ],
- [
- "符",
- "åIJĪ"
- ],
- [
- "ä¸Ģ",
- "ä½į"
- ],
- [
- "åĨħ",
- "éĥ¨"
- ],
- [
- "Ġlook",
- "ing"
- ],
- [
- "d",
- "ing"
- ],
- [
- "æĬ",
- "ĺ"
- ],
- [
- "è¾",
- "ij"
- ],
- [
- "è¿Ļ个",
- "éĹ®é¢ĺ"
- ],
- [
- "Ġes",
- "pecially"
- ],
- [
- "çľ",
- "ł"
- ],
- [
- "âĢĿ",
- "ãĢĤ"
- ],
- [
- "å¥",
- "ı"
- ],
- [
- "ra",
- "y"
- ],
- [
- "è¿ĺ",
- "åı¯ä»¥"
- ],
- [
- "åĪĽ",
- "ä½ľ"
- ],
- [
- "com",
- "ing"
- ],
- [
- "Ġmulti",
- "ple"
- ],
- [
- "éļ",
- "IJ"
- ],
- [
- "æ³",
- "¡"
- ],
- [
- "æłĩ",
- "åĩĨ"
- ],
- [
- "Ġm",
- "il"
- ],
- [
- "éľĢè¦ģ",
- "注æĦı"
- ],
- [
- "Ġanx",
- "iety"
- ],
- [
- "æĶ¹",
- "è¿Ľ"
- ],
- [
- "å±",
- "ĭ"
- ],
- [
- "污",
- "æŁĵ"
- ],
- [
- "ç¼ĸ",
- "ç¨ĭ"
- ],
- [
- "è´¹",
- "ç͍"
- ],
- [
- "Ġev",
- "alu"
- ],
- [
- "imate",
- "ly"
- ],
- [
- "Ġlit",
- "er"
- ],
- [
- "ogra",
- "ph"
- ],
- [
- "Ġse",
- "arch"
- ],
- [
- "1",
- "6"
- ],
- [
- "en",
- "ced"
- ],
- [
- "Ġmeth",
- "ods"
- ],
- [
- "çĥ",
- "Ī"
- ],
- [
- "模",
- "å¼ı"
- ],
- [
- "çĬ¶",
- "åĨµ"
- ],
- [
- "æĶ¹",
- "åĸĦ"
- ],
- [
- "å¤ļ",
- "æł·"
- ],
- [
- "c",
- "er"
- ],
- [
- "å¥",
- "ĸ"
- ],
- [
- "Ġsat",
- "is"
- ],
- [
- "Ġwebs",
- "ite"
- ],
- [
- "åĬ",
- "ŀ"
- ],
- [
- "åģ¥",
- "身"
- ],
- [
- "Ġglo",
- "bal"
- ],
- [
- "Ġas",
- "k"
- ],
- [
- "Ġplatform",
- "s"
- ],
- [
- "Ġdise",
- "ases"
- ],
- [
- "çݰ",
- "象"
- ],
- [
- "t",
- "ics"
- ],
- [
- "æ±",
- "ģ"
- ],
- [
- "åΤ",
- "æĸŃ"
- ],
- [
- "Ġcon",
- "vers"
- ],
- [
- "Ġrelations",
- "hip"
- ],
- [
- "设",
- "ç½®"
- ],
- [
- "æ³ķ",
- "å¾ĭ"
- ],
- [
- "Ġmind",
- "ful"
- ],
- [
- "é¢Ħ",
- "æµĭ"
- ],
- [
- "o",
- "very"
- ],
- [
- "åģ",
- "ľ"
- ],
- [
- "ç͵",
- "è§Ĩ"
- ],
- [
- "è§Ħ",
- "åĪĻ"
- ],
- [
- "ak",
- "en"
- ],
- [
- "Ġimplement",
- "ing"
- ],
- [
- "is",
- "ing"
- ],
- [
- "åıĤ",
- "åĬł"
- ],
- [
- "æĥħ",
- "绪"
- ],
- [
- "Ġprovid",
- "ed"
- ],
- [
- "æ·±",
- "åħ¥"
- ],
- [
- "Ġprogramm",
- "ed"
- ],
- [
- "Ġrelev",
- "ant"
- ],
- [
- "çļĦç",
- "ĥ"
- ],
- [
- "çĸ",
- "¾çĹħ"
- ],
- [
- "åĮ»",
- "çĶŁ"
- ],
- [
- "åĪĽ",
- "建"
- ],
- [
- "Ġgener",
- "ate"
- ],
- [
- "æĶ¶",
- "åħ¥"
- ],
- [
- "ä¼",
- "ij"
- ],
- [
- "iz",
- "es"
- ],
- [
- "Ġtrans",
- "form"
- ],
- [
- "éģ",
- "µ"
- ],
- [
- "ast",
- "ic"
- ],
- [
- "åij",
- "Ī"
- ],
- [
- "æ¯ı",
- "个人"
- ],
- [
- "è¿",
- "Ķ"
- ],
- [
- "i",
- "et"
- ],
- [
- "Ġv",
- "oice"
- ],
- [
- "éĢ",
- "Ķ"
- ],
- [
- "æĶ¾",
- "æĿ¾"
- ],
- [
- "åį",
- "´"
- ],
- [
- "èĥ",
- "ľ"
- ],
- [
- "Ġst",
- "ructure"
- ],
- [
- "æĹ¶",
- "å°ļ"
- ],
- [
- "Ġ",
- "Q"
- ],
- [
- "Ġel",
- "se"
- ],
- [
- "du",
- "c"
- ],
- [
- "Ġem",
- "p"
- ],
- [
- "èģ",
- "ļ"
- ],
- [
- "è´",
- "§"
- ],
- [
- "ac",
- "hes"
- ],
- [
- "ç§",
- "Ģ"
- ],
- [
- "an",
- "ks"
- ],
- [
- "Ġn",
- "ight"
- ],
- [
- "Ġprofessional",
- "s"
- ],
- [
- "Ġb",
- "as"
- ],
- [
- "è´",
- "µ"
- ],
- [
- "e",
- "c"
- ],
- [
- "Ġdivers",
- "ity"
- ],
- [
- "it",
- "es"
- ],
- [
- "d",
- "r"
- ],
- [
- "åĽ°",
- "éļ¾"
- ],
- [
- "ĥ",
- "åľ"
- ],
- [
- "åŀ",
- "ĥåľ"
- ],
- [
- "åŀĥåľ",
- "¾"
- ],
- [
- "Ġd",
- "rug"
- ],
- [
- "ç¢",
- "³"
- ],
- [
- "Ġn",
- "ame"
- ],
- [
- "åĮĸ",
- "çļĦ"
- ],
- [
- "a",
- "id"
- ],
- [
- "æľĢ",
- "大"
- ],
- [
- "æij",
- "Ħ"
- ],
- [
- "ç®Ģåįķ",
- "çļĦ"
- ],
- [
- "Ġw",
- "arm"
- ],
- [
- "Ġd",
- "one"
- ],
- [
- "Ġfun",
- "ction"
- ],
- [
- "as",
- "c"
- ],
- [
- "强",
- "è°ĥ"
- ],
- [
- "Ġdem",
- "and"
- ],
- [
- "Ġvis",
- "ual"
- ],
- [
- "Ġup",
- "d"
- ],
- [
- "æŃ£",
- "åľ¨"
- ],
- [
- "Ġsim",
- "ilar"
- ],
- [
- "éĢ",
- "Ĵ"
- ],
- [
- "æ¯",
- "Ľ"
- ],
- [
- "éĶ",
- "»"
- ],
- [
- "ent",
- "ly"
- ],
- [
- "Ġvalu",
- "able"
- ],
- [
- "Ġdis",
- "aster"
- ],
- [
- "ä¸Ģ",
- "èά"
- ],
- [
- "æ´",
- "²"
- ],
- [
- "ĠR",
- "eg"
- ],
- [
- "Ġdiscrim",
- "ination"
- ],
- [
- "åĨĻ",
- "ä¸Ģç¯ĩ"
- ],
- [
- "Ġgovern",
- "ment"
- ],
- [
- "Ġ",
- "好çļĦ"
- ],
- [
- "5",
- "00"
- ],
- [
- "ly",
- "ing"
- ],
- [
- "Ġpre",
- "v"
- ],
- [
- "Ġpre",
- "pare"
- ],
- [
- "Ġproble",
- "ms"
- ],
- [
- "è·",
- "³"
- ],
- [
- "Ġpro",
- "m"
- ],
- [
- "åĨ",
- "²"
- ],
- [
- "å®ī",
- "è£ħ"
- ],
- [
- "éĶ»",
- "çĤ¼"
- ],
- [
- "æµ",
- "ĵ"
- ],
- [
- "è",
- "¹"
- ],
- [
- "åºĶç͍",
- "ç¨ĭåºı"
- ],
- [
- "n",
- "g"
- ],
- [
- "Ġcomp",
- "et"
- ],
- [
- "åĪĨ",
- "åĪ«"
- ],
- [
- "olo",
- "gical"
- ],
- [
- "å®",
- "¡"
- ],
- [
- "Ġtrans",
- "l"
- ],
- [
- "Ġdire",
- "ct"
- ],
- [
- "åī",
- "Ĥ"
- ],
- [
- "Ġsuggest",
- "ions"
- ],
- [
- "Ġpap",
- "er"
- ],
- [
- "Ġrecogn",
- "ize"
- ],
- [
- "t",
- "on"
- ],
- [
- "Ġmit",
- "igate"
- ],
- [
- "讨",
- "论"
- ],
- [
- "äºĴ",
- "åĬ¨"
- ],
- [
- "ĠE",
- "ar"
- ],
- [
- "Ġam",
- "azing"
- ],
- [
- "c",
- "re"
- ],
- [
- "é¦",
- "Ī"
- ],
- [
- "Ġinvol",
- "ved"
- ],
- [
- "f",
- "ace"
- ],
- [
- "æľī",
- "åħ³"
- ],
- [
- ")",
- ")"
- ],
- [
- "Ġex",
- "ce"
- ],
- [
- "Ġproduct",
- "ivity"
- ],
- [
- "è",
- "Ń"
- ],
- [
- "é¦",
- "Ĩ"
- ],
- [
- "Ġsound",
- "s"
- ],
- [
- "Ġidentify",
- "ing"
- ],
- [
- "]",
- ","
- ],
- [
- "é¾",
- "Ļ"
- ],
- [
- "Ġf",
- "it"
- ],
- [
- "Ġcontribut",
- "e"
- ],
- [
- "th",
- "s"
- ],
- [
- "friend",
- "ly"
- ],
- [
- "e",
- "le"
- ],
- [
- "if",
- "ied"
- ],
- [
- "iven",
- "ess"
- ],
- [
- "ite",
- "ly"
- ],
- [
- "Ġ",
- "X"
- ],
- [
- "Ġl",
- "ed"
- ],
- [
- "åĿ",
- "ı"
- ],
- [
- "Ġhist",
- "or"
- ],
- [
- "Ġd",
- "at"
- ],
- [
- "Ġjour",
- "ney"
- ],
- [
- "Ġ",
- "}"
- ],
- [
- "Ġse",
- "lect"
- ],
- [
- "æ¼",
- "«"
- ],
- [
- "Ġcon",
- "duct"
- ],
- [
- "è¿Ľ",
- "ä¸ĢæŃ¥"
- ],
- [
- "ç»Ļ",
- "æĪij"
- ],
- [
- "Ġl",
- "if"
- ],
- [
- "è£ħ",
- "ä¿®"
- ],
- [
- "为",
- "ä»Ģä¹Ī"
- ],
- [
- "äº",
- "¬"
- ],
- [
- "Ġn",
- "av"
- ],
- [
- "Ġwho",
- "le"
- ],
- [
- "ç",
- "¹ģ"
- ],
- [
- "åĨ",
- "ľ"
- ],
- [
- "æĶ",
- "»"
- ],
- [
- "Ġb",
- "reat"
- ],
- [
- "Ġm",
- "iss"
- ],
- [
- "é¾",
- "Ħ"
- ],
- [
- "t",
- "t"
- ],
- [
- "s",
- "w"
- ],
- [
- "Ġb",
- "ar"
- ],
- [
- "请",
- "éĹ®"
- ],
- [
- "èģĶ",
- "ç½ij"
- ],
- [
- "Ġatt",
- "ract"
- ],
- [
- "æĤ¨",
- "åı¯ä»¥"
- ],
- [
- "O",
- "ne"
- ],
- [
- "åħħ",
- "åĪĨ"
- ],
- [
- "r",
- "ing"
- ],
- [
- "Ġå½ĵ",
- "çĦ¶"
- ],
- [
- "re",
- "am"
- ],
- [
- "Ġev",
- "ol"
- ],
- [
- "Ġs",
- "n"
- ],
- [
- "ĠE",
- "m"
- ],
- [
- "m",
- "osp"
- ],
- [
- "Ġcho",
- "ose"
- ],
- [
- "v",
- "iew"
- ],
- [
- "Ġar",
- "r"
- ],
- [
- "Ġs",
- "leep"
- ],
- [
- "end",
- "ed"
- ],
- [
- "æŀ",
- "¶"
- ],
- [
- "Ġveh",
- "icles"
- ],
- [
- "Ġf",
- "resh"
- ],
- [
- "Ġorganiz",
- "ation"
- ],
- [
- "è¿Ļ",
- "段"
- ],
- [
- "æ±",
- "¤"
- ],
- [
- "ĠI",
- "nt"
- ],
- [
- "Ġcont",
- "ext"
- ],
- [
- "åı¦",
- "å¤ĸ"
- ],
- [
- "Ġoce",
- "an"
- ],
- [
- "æĦŁ",
- "åıĹ"
- ],
- [
- "Ġpollut",
- "ion"
- ],
- [
- "ur",
- "b"
- ],
- [
- "æī§",
- "è¡Į"
- ],
- [
- "erson",
- "al"
- ],
- [
- "ĠHe",
- "alth"
- ],
- [
- "ä¼ĺ",
- "çĤ¹"
- ],
- [
- "Ġatt",
- "ention"
- ],
- [
- "æľī",
- "çĿĢ"
- ],
- [
- "é£Ł",
- "æĿIJ"
- ],
- [
- "Ġer",
- "r"
- ],
- [
- "çļĦæĿ",
- "¥"
- ],
- [
- "çļĦç",
- "Ī"
- ],
- [
- "èŃ",
- "¦"
- ],
- [
- "è·",
- "Ł"
- ],
- [
- "æĹħ",
- "è¡Į"
- ],
- [
- "èĴ",
- "ľ"
- ],
- [
- "çļĦæĢ",
- "Ŀ"
- ],
- [
- "Ġchat",
- "bot"
- ],
- [
- "çļĦ",
- "éľĢæ±Ĥ"
- ],
- [
- "çķ",
- "¥"
- ],
- [
- "Ġfeel",
- "ing"
- ],
- [
- "Ġimplement",
- "ed"
- ],
- [
- "社",
- "åĮº"
- ],
- [
- "çļĦ",
- "建议"
- ],
- [
- "æIJ",
- "ħ"
- ],
- [
- "éĹ",
- "»"
- ],
- [
- "åıį",
- "é¦Ī"
- ],
- [
- "缴",
- "æİ¥"
- ],
- [
- "æĺ",
- "¥"
- ],
- [
- "it",
- "able"
- ],
- [
- "æĪij",
- "ä¼ļ"
- ],
- [
- "åį",
- "±"
- ],
- [
- "èī¯",
- "好"
- ],
- [
- "Ġl",
- "iving"
- ],
- [
- "åıĺ",
- "éĩı"
- ],
- [
- "ĠB",
- "ut"
- ],
- [
- "Ġcomple",
- "te"
- ],
- [
- "Ġtre",
- "nds"
- ],
- [
- "Ġm",
- "akes"
- ],
- [
- "ä»Ĭ",
- "天"
- ],
- [
- "Ġdist",
- "ribut"
- ],
- [
- "Ġcomm",
- "it"
- ],
- [
- "Ġat",
- "mosp"
- ],
- [
- "ä¼",
- "´"
- ],
- [
- "Ġsens",
- "ors"
- ],
- [
- "Ġs",
- "w"
- ],
- [
- "æĹł",
- "论"
- ],
- [
- "om",
- "en"
- ],
- [
- "æĶ¿",
- "åºľ"
- ],
- [
- "Ġchall",
- "enge"
- ],
- [
- "Ġt",
- "urn"
- ],
- [
- "çIJĨ",
- "论"
- ],
- [
- "p",
- "ar"
- ],
- [
- "Ġwrit",
- "e"
- ],
- [
- "ç»ı",
- "åħ¸"
- ],
- [
- "em",
- "ember"
- ],
- [
- "é¥",
- "Ń"
- ],
- [
- "æĸ¹",
- "便"
- ],
- [
- "Ġc",
- "u"
- ],
- [
- "Ġval",
- "ue"
- ],
- [
- "Ġf",
- "und"
- ],
- [
- "p",
- "ose"
- ],
- [
- "è°ĥ",
- "æŁ¥"
- ],
- [
- "çĿ",
- "¡"
- ],
- [
- "Ġcommunic",
- "ate"
- ],
- [
- "Ġdise",
- "ase"
- ],
- [
- "Ġrese",
- "arc"
- ],
- [
- "Ġl",
- "ack"
- ],
- [
- "arn",
- "ing"
- ],
- [
- "ĠP",
- "ark"
- ],
- [
- "çĦ",
- "¦"
- ],
- [
- "é«ĺ",
- "度"
- ],
- [
- "Ġr",
- "ather"
- ],
- [
- "å®",
- "£"
- ],
- [
- "çĪ",
- "¶"
- ],
- [
- "éĺ",
- "¶"
- ],
- [
- "è®",
- "¢"
- ],
- [
- "çĥ",
- "§"
- ],
- [
- "Ġhig",
- "her"
- ],
- [
- "Ġsumm",
- "ary"
- ],
- [
- "ĠA",
- "ut"
- ],
- [
- "çļĦæ",
- "³"
- ],
- [
- "Ġe",
- "le"
- ],
- [
- "is",
- "ms"
- ],
- [
- "Ġrel",
- "i"
- ],
- [
- "ä¹Ł",
- "ä¼ļ"
- ],
- [
- "f",
- "ra"
- ],
- [
- "åijĬè¯ī",
- "æĪij"
- ],
- [
- "æĬ",
- "½"
- ],
- [
- "Ġsitu",
- "ations"
- ],
- [
- "Ġmar",
- "ine"
- ],
- [
- "æĥ³",
- "è¦ģ"
- ],
- [
- "in",
- "ci"
- ],
- [
- "in",
- "al"
- ],
- [
- "Ġg",
- "ain"
- ],
- [
- "Ġdiffere",
- "nce"
- ],
- [
- "æľºåύ",
- "人"
- ],
- [
- "æµģ",
- "ç¨ĭ"
- ],
- [
- "ĠC",
- "hat"
- ],
- [
- "ç½ij",
- "ç«Ļ"
- ],
- [
- "æľ",
- "«"
- ],
- [
- "Ġcol",
- "or"
- ],
- [
- "Ġas",
- "pect"
- ],
- [
- "ç½",
- "Ĺ"
- ],
- [
- "ĠE",
- "duc"
- ],
- [
- "Ġde",
- "ploy"
- ],
- [
- "Ġbeaut",
- "y"
- ],
- [
- "æĤ",
- "£"
- ],
- [
- "ruct",
- "ion"
- ],
- [
- "it",
- "ut"
- ],
- [
- "æĿ",
- "Ł"
- ],
- [
- "让",
- "æĪij们"
- ],
- [
- "éķ¿",
- "度"
- ],
- [
- "ul",
- "es"
- ],
- [
- "æ¶ī",
- "åıĬ"
- ],
- [
- "Ġdig",
- "ital"
- ],
- [
- "Ġexist",
- "ing"
- ],
- [
- "ĠO",
- "r"
- ],
- [
- "\\_",
- "\\_"
- ],
- [
- "Ġback",
- "ground"
- ],
- [
- "çĹ",
- "ĩ"
- ],
- [
- "æ¯ı",
- "天"
- ],
- [
- "p",
- "ython"
- ],
- [
- "Ġfarm",
- "ers"
- ],
- [
- "Ġcontin",
- "u"
- ],
- [
- "\"",
- ":"
- ],
- [
- "Ġg",
- "iven"
- ],
- [
- "å°ı",
- "æĹ¶"
- ],
- [
- "Ġmom",
- "ent"
- ],
- [
- "2",
- "00"
- ],
- [
- "J",
- "ohn"
- ],
- [
- "éĿ¢",
- "对"
- ],
- [
- "Ġint",
- "ro"
- ],
- [
- "Ġtherap",
- "y"
- ],
- [
- "è¿Ķ",
- "åĽŀ"
- ],
- [
- "å¹¶",
- "åľ¨"
- ],
- [
- "Ġ",
- "z"
- ],
- [
- "Ġaff",
- "ord"
- ],
- [
- "ä¸",
- "Ŀ"
- ],
- [
- "å®",
- "½"
- ],
- [
- "Ġ",
- "Ã"
- ],
- [
- "ĠN",
- "ational"
- ],
- [
- "èĥ",
- "¡"
- ],
- [
- "Ġexercis",
- "e"
- ],
- [
- "æIJħ",
- "æĭĮ"
- ],
- [
- "æĶ¯",
- "ä»ĺ"
- ],
- [
- "éĺ³",
- "åħī"
- ],
- [
- "è¯",
- "ļ"
- ],
- [
- "Ġs",
- "ect"
- ],
- [
- "ĠS",
- "u"
- ],
- [
- "å¢ŀ",
- "éķ¿"
- ],
- [
- "ç¾İ",
- "丽"
- ],
- [
- "Ġw",
- "a"
- ],
- [
- "以ä¸ĭæĺ¯",
- "ä¸ĢäºĽ"
- ],
- [
- "èĽĭ",
- "ç³ķ"
- ],
- [
- "Ġ",
- "ill"
- ],
- [
- "æ¸ħ",
- "æĻ"
- ],
- [
- "et",
- "ry"
- ],
- [
- "æ¢",
- "¦"
- ],
- [
- "ç¾İ",
- "åĽ½"
- ],
- [
- "ä»",
- "į"
- ],
- [
- "one",
- "y"
- ],
- [
- "Ġecosystem",
- "s"
- ],
- [
- "æĮĩ",
- "导"
- ],
- [
- "d",
- "ef"
- ],
- [
- "9",
- "9"
- ],
- [
- "æŁ",
- "Ķ"
- ],
- [
- "pp",
- "ed"
- ],
- [
- "Ġlim",
- "it"
- ],
- [
- "çİ",
- "ī"
- ],
- [
- "Ġacad",
- "emic"
- ],
- [
- "Ġrestaur",
- "ants"
- ],
- [
- "Ġhe",
- "ad"
- ],
- [
- "ä¿¡",
- "ä»»"
- ],
- [
- "ast",
- "ers"
- ],
- [
- "å²",
- "ģ"
- ],
- [
- "ak",
- "ers"
- ],
- [
- "1",
- "4"
- ],
- [
- "A",
- "s"
- ],
- [
- "æł",
- "¡"
- ],
- [
- "é«ĺ",
- "æķĪ"
- ],
- [
- "ph",
- "as"
- ],
- [
- "y",
- "n"
- ],
- [
- "ç¨ĭ",
- "度"
- ],
- [
- "è¾",
- "£"
- ],
- [
- "ä¸Ĭ",
- "éĿ¢"
- ],
- [
- "å®¶",
- "å±ħ"
- ],
- [
- "ter",
- "m"
- ],
- [
- "ç¾İ",
- "é£Ł"
- ],
- [
- "Ġo",
- "vers"
- ],
- [
- "å®",
- "ĺ"
- ],
- [
- "Ġind",
- "ic"
- ],
- [
- "ĠY",
- "our"
- ],
- [
- "S",
- "t"
- ],
- [
- "å½¢",
- "象"
- ],
- [
- "è´",
- "¡"
- ],
- [
- "åº",
- "Ĭ"
- ],
- [
- "ĠS",
- "c"
- ],
- [
- "ag",
- "ra"
- ],
- [
- "羣",
- "æŃ£"
- ],
- [
- "o",
- "int"
- ],
- [
- "id",
- "s"
- ],
- [
- "are",
- "nt"
- ],
- [
- "éĵ",
- "¶"
- ],
- [
- "èģ",
- "Ĭ"
- ],
- [
- "Ġreg",
- "ular"
- ],
- [
- "ä¼ĺ",
- "ç§Ģ"
- ],
- [
- "Ġcol",
- "le"
- ],
- [
- "çĸ",
- "ij"
- ],
- [
- "Ġsub",
- "ject"
- ],
- [
- "Ġgreat",
- "er"
- ],
- [
- "Ġst",
- "ore"
- ],
- [
- "åŁ¹",
- "è®Ń"
- ],
- [
- "Ġim",
- "ag"
- ],
- [
- "Ġan",
- "sw"
- ],
- [
- "ä½",
- "Ļ"
- ],
- [
- "Ġsp",
- "ot"
- ],
- [
- "åĪĨ",
- "åŃIJ"
- ],
- [
- "Ġaud",
- "ience"
- ],
- [
- "p",
- "et"
- ],
- [
- "Ġv",
- "ers"
- ],
- [
- "Ġtra",
- "il"
- ],
- [
- "åĭ",
- "ĩ"
- ],
- [
- "er",
- "ous"
- ],
- [
- "Ġguid",
- "ance"
- ],
- [
- "Ġspe",
- "ech"
- ],
- [
- "åĵ",
- "²"
- ],
- [
- "æĺ¯",
- "çͱ"
- ],
- [
- "è´¡",
- "çĮ®"
- ],
- [
- "åIJĪéĢĤ",
- "çļĦ"
- ],
- [
- "设",
- "æĸ½"
- ],
- [
- "ä»ĸ",
- "人"
- ],
- [
- "ens",
- "ive"
- ],
- [
- "åĢ",
- "¾"
- ],
- [
- "al",
- "ing"
- ],
- [
- "Ġproject",
- "s"
- ],
- [
- "å",
- "³"
- ],
- [
- "Ġt",
- "akes"
- ],
- [
- "ç»",
- "©"
- ],
- [
- "T",
- "hat"
- ],
- [
- "Ġb",
- "ro"
- ],
- [
- "iv",
- "ed"
- ],
- [
- "Ġ",
- "&"
- ],
- [
- "åĿ",
- "IJ"
- ],
- [
- "place",
- "ment"
- ],
- [
- "è¿ŀ",
- "æİ¥"
- ],
- [
- "çļĦç¤",
- "¾"
- ],
- [
- "ĠT",
- "ra"
- ],
- [
- "Ġrel",
- "ax"
- ],
- [
- "u",
- "fact"
- ],
- [
- "éģ",
- "į"
- ],
- [
- "Ġsur",
- "v"
- ],
- [
- "åı£",
- "åij³"
- ],
- [
- "Ġcreat",
- "ivity"
- ],
- [
- "o",
- "f"
- ],
- [
- "å¨",
- "ģ"
- ],
- [
- "çļĦç",
- "ł"
- ],
- [
- "Ġbreat",
- "h"
- ],
- [
- "Ġpl",
- "aces"
- ],
- [
- "Ġdesc",
- "rib"
- ],
- [
- "èĭ±",
- "è¯Ń"
- ],
- [
- "Ġdam",
- "age"
- ],
- [
- "or",
- "ation"
- ],
- [
- "为",
- "æĤ¨"
- ],
- [
- "if",
- "t"
- ],
- [
- "Ġc",
- "ase"
- ],
- [
- "å¹´",
- "é¾Ħ"
- ],
- [
- "Ġp",
- "ress"
- ],
- [
- "çĶ",
- "ľ"
- ],
- [
- "éĩ",
- "İ"
- ],
- [
- "æĹħ",
- "游"
- ],
- [
- "Ġt",
- "aken"
- ],
- [
- "in",
- "ed"
- ],
- [
- "Ġcon",
- "cept"
- ],
- [
- "æĴ",
- "Ń"
- ],
- [
- "Ġinterest",
- "ing"
- ],
- [
- "è·",
- "µ"
- ],
- [
- "Ġse",
- "a"
- ],
- [
- "6",
- "0"
- ],
- [
- "Ġf",
- "oot"
- ],
- [
- "ĠN",
- "ame"
- ],
- [
- "Ġresearc",
- "hers"
- ],
- [
- "éĢ",
- "ģ"
- ],
- [
- "Ġwe",
- "e"
- ],
- [
- ")",
- ";"
- ],
- [
- "çļĦ",
- "åħ³éĶ®"
- ],
- [
- "ä¼",
- "½"
- ],
- [
- "ele",
- "br"
- ],
- [
- "å¡",
- "ij"
- ],
- [
- "W",
- "e"
- ],
- [
- "ç»ı",
- "常"
- ],
- [
- "Ġpopul",
- "ations"
- ],
- [
- "åħ¬",
- "å¼ı"
- ],
- [
- "or",
- "n"
- ],
- [
- "çĩ",
- "ĥ"
- ],
- [
- "人",
- "çĶŁ"
- ],
- [
- "1",
- "7"
- ],
- [
- "æİ¥",
- "åıĹ"
- ],
- [
- "Ġloc",
- "ation"
- ],
- [
- "Ġin",
- "equ"
- ],
- [
- "Ġinter",
- "vent"
- ],
- [
- "Ġinterest",
- "ed"
- ],
- [
- "Ġdefin",
- "itely"
- ],
- [
- "Ġassist",
- "ance"
- ],
- [
- "è¿Ļ",
- "ä¸Ģ"
- ],
- [
- "åIJĪ",
- "åIJĮ"
- ],
- [
- "ä¼ĺ",
- "åĬ¿"
- ],
- [
- "çļĦ",
- "å·¥ä½ľ"
- ],
- [
- "Ġ1",
- "2"
- ],
- [
- "Ġmo",
- "v"
- ],
- [
- "åģ",
- "ı"
- ],
- [
- "åŃĺ",
- "åĤ¨"
- ],
- [
- "us",
- "ive"
- ],
- [
- "æĹ",
- "ı"
- ],
- [
- "ï¼ī",
- "ï¼Į"
- ],
- [
- "Ġg",
- "as"
- ],
- [
- "Ġinterest",
- "s"
- ],
- [
- "æ¸ħæĻ",
- "°"
- ],
- [
- "Ġg",
- "ard"
- ],
- [
- "çĸ",
- "«"
- ],
- [
- "Ġs",
- "ay"
- ],
- [
- "å¤",
- "«"
- ],
- [
- "g",
- "es"
- ],
- [
- "èIJ",
- "¨"
- ],
- [
- "ä¸ļ",
- "åĬ¡"
- ],
- [
- "个",
- "æĢ§"
- ],
- [
- "åIJ",
- "¯"
- ],
- [
- "Ġeng",
- "agement"
- ],
- [
- "Ġb",
- "ig"
- ],
- [
- "éľĢè¦ģ",
- "èĢĥèĻij"
- ],
- [
- "Ġpr",
- "inci"
- ],
- [
- "åij¨",
- "åĽ´"
- ],
- [
- "Ġopportun",
- "ity"
- ],
- [
- "çģ",
- "¾"
- ],
- [
- "èĹ",
- "ı"
- ],
- [
- "re",
- "l"
- ],
- [
- "缺",
- "çĤ¹"
- ],
- [
- "Ġhapp",
- "y"
- ],
- [
- "åĴĮ",
- "åħ¶ä»ĸ"
- ],
- [
- "av",
- "a"
- ],
- [
- "Ġestab",
- "lish"
- ],
- [
- "鸡",
- "èĽĭ"
- ],
- [
- "i",
- "king"
- ],
- [
- "ĠT",
- "rans"
- ],
- [
- "rast",
- "ructure"
- ],
- [
- "fore",
- "st"
- ],
- [
- "èİ·",
- "åıĸ"
- ],
- [
- "èĦ",
- "ļ"
- ],
- [
- "in",
- "ally"
- ],
- [
- "èµ",
- "ı"
- ],
- [
- "Ġdel",
- "icious"
- ],
- [
- "Ġresult",
- "s"
- ],
- [
- "è§Ĥ",
- "å¯Ł"
- ],
- [
- "å®ŀ",
- "è·µ"
- ],
- [
- "Ġl",
- "ast"
- ],
- [
- "Ġpol",
- "it"
- ],
- [
- "æĢ§",
- "èĥ½"
- ],
- [
- "F",
- "or"
- ],
- [
- "b",
- "i"
- ],
- [
- "缸",
- "ä¿¡"
- ],
- [
- "ff",
- "ee"
- ],
- [
- "Ġph",
- "r"
- ],
- [
- "Ġfore",
- "st"
- ],
- [
- "ell",
- "ing"
- ],
- [
- "æµģ",
- "è¡Į"
- ],
- [
- "at",
- "ic"
- ],
- [
- "大",
- "å®¶"
- ],
- [
- "ĠIn",
- "st"
- ],
- [
- "æķ°",
- "åѦ"
- ],
- [
- "æī",
- "©"
- ],
- [
- "å®Į",
- "åħ¨"
- ],
- [
- "å¼ķ",
- "èµ·"
- ],
- [
- "es",
- "e"
- ],
- [
- "转",
- "æį¢"
- ],
- [
- "Ġaffect",
- "ed"
- ],
- [
- "Ġrobot",
- "ics"
- ],
- [
- "综",
- "ä¸Ĭ"
- ],
- [
- "Ġpro",
- "p"
- ],
- [
- "让",
- "人"
- ],
- [
- "æ²",
- "³"
- ],
- [
- "ä¸Ń",
- "æľĢ"
- ],
- [
- "Ġaut",
- "onomous"
- ],
- [
- "Ġha",
- "ving"
- ],
- [
- "Ġtri",
- "p"
- ],
- [
- "ur",
- "y"
- ],
- [
- "Ġbi",
- "ased"
- ],
- [
- "Ġconsider",
- "ations"
- ],
- [
- "Ġpartic",
- "ular"
- ],
- [
- "åį",
- "ł"
- ],
- [
- "æİ¨",
- "广"
- ],
- [
- "Ġiniti",
- "atives"
- ],
- [
- "ial",
- "s"
- ],
- [
- "åij³",
- "éģĵ"
- ],
- [
- "Ġtreat",
- "ments"
- ],
- [
- "Ġem",
- "phas"
- ],
- [
- "çĭ¬çī¹",
- "çļĦ"
- ],
- [
- "Ġl",
- "ay"
- ],
- [
- "æĶ¿",
- "çŃĸ"
- ],
- [
- "æĢİ",
- "ä¹Ī"
- ],
- [
- "ron",
- "ic"
- ],
- [
- "pl",
- "ay"
- ],
- [
- "Ġco",
- "ok"
- ],
- [
- "è¿Ľ",
- "åħ¥"
- ],
- [
- "è½",
- "®"
- ],
- [
- "Ġvol",
- "unte"
- ],
- [
- "Ġra",
- "in"
- ],
- [
- "ĠM",
- "on"
- ],
- [
- "Ġconsum",
- "ption"
- ],
- [
- "èĽĭ",
- "çϽ"
- ],
- [
- "ĠS",
- "oc"
- ],
- [
- "å£",
- "¤"
- ],
- [
- "Ġrout",
- "ine"
- ],
- [
- "Ġimpro",
- "ved"
- ],
- [
- "T",
- "o"
- ],
- [
- "人",
- "çī©"
- ],
- [
- "读",
- "èĢħ"
- ],
- [
- "Ġgo",
- "al"
- ],
- [
- "广",
- "åijĬ"
- ],
- [
- "éķ¿",
- "æľŁ"
- ],
- [
- "Ġe",
- "y"
- ],
- [
- "H",
- "e"
- ],
- [
- "Ġout",
- "do"
- ],
- [
- "Ġcu",
- "is"
- ],
- [
- "Ġa",
- "way"
- ],
- [
- "Ġbo",
- "oks"
- ],
- [
- "Ġtop",
- "ic"
- ],
- [
- "大",
- "åĪ©"
- ],
- [
- "h",
- "ouse"
- ],
- [
- "Ġon",
- "es"
- ],
- [
- "ç§",
- "Ł"
- ],
- [
- "'",
- ":"
- ],
- [
- "æĪ¿",
- "å±ĭ"
- ],
- [
- "ç§»",
- "åĬ¨"
- ],
- [
- "Ġdis",
- "asters"
- ],
- [
- "est",
- "s"
- ],
- [
- "ill",
- "ing"
- ],
- [
- "绿",
- "èī²"
- ],
- [
- "åĵ²",
- "åѦ"
- ],
- [
- "æĪIJ",
- "åĪĨ"
- ],
- [
- "Ġocc",
- "ur"
- ],
- [
- "ľ",
- "ä¼½"
- ],
- [
- "åľŁ",
- "壤"
- ],
- [
- "çļĦ",
- "主è¦ģ"
- ],
- [
- "çݰ",
- "å®ŀ"
- ],
- [
- "Ġanim",
- "al"
- ],
- [
- "é¢Ĩ",
- "导"
- ],
- [
- "Ġview",
- "s"
- ],
- [
- "éĤ",
- "®"
- ],
- [
- "æ°§",
- "åĮĸ"
- ],
- [
- "ath",
- "y"
- ],
- [
- "éģĵ",
- "å¾·"
- ],
- [
- "社交",
- "åªĴä½ĵ"
- ],
- [
- "ĠP",
- "ersonal"
- ],
- [
- "Ľ",
- "åĽ´"
- ],
- [
- "Ġpur",
- "ch"
- ],
- [
- "Ġcount",
- "ry"
- ],
- [
- "Ġrem",
- "ind"
- ],
- [
- "å¯",
- "¸"
- ],
- [
- "Ġr",
- "ights"
- ],
- [
- "çļĦ",
- "çݯå¢ĥ"
- ],
- [
- "ĠP",
- "r"
- ],
- [
- "Ġl",
- "ine"
- ],
- [
- "ib",
- "r"
- ],
- [
- "é©",
- "¾"
- ],
- [
- "Ġm",
- "aj"
- ],
- [
- "Ġover",
- "come"
- ],
- [
- "Ġne",
- "xt"
- ],
- [
- "æīĢ",
- "è¿°"
- ],
- [
- "è§Ħ",
- "å®ļ"
- ],
- [
- "Ġinteract",
- "ions"
- ],
- [
- "Ġconf",
- "lic"
- ],
- [
- "Ġwh",
- "y"
- ],
- [
- "ç³»",
- "åĪĹ"
- ],
- [
- "å°",
- "¼"
- ],
- [
- "ib",
- "ly"
- ],
- [
- "çīĽ",
- "奶"
- ],
- [
- "Ġrespons",
- "es"
- ],
- [
- "s",
- "es"
- ],
- [
- "åѦ",
- "ä¼ļ"
- ],
- [
- "b",
- "ol"
- ],
- [
- "Ġstand",
- "ards"
- ],
- [
- "ul",
- "ner"
- ],
- [
- "对è¯Ŀ",
- "åĨħ容"
- ],
- [
- "l",
- "ished"
- ],
- [
- "çļĦæĢ",
- "§"
- ],
- [
- "çĶŁæĢģ",
- "ç³»ç»Ł"
- ],
- [
- "an",
- "n"
- ],
- [
- "æĥħåĨµ",
- "ä¸ĭ"
- ],
- [
- "寻",
- "æ±Ĥ"
- ],
- [
- "Ġh",
- "old"
- ],
- [
- "d",
- "en"
- ],
- [
- "åį",
- "ĥ"
- ],
- [
- "Ġment",
- "ion"
- ],
- [
- "ĠMan",
- "y"
- ],
- [
- "缴",
- "åΰ"
- ],
- [
- "éģ",
- "Ĺ"
- ],
- [
- "he",
- "l"
- ],
- [
- "Ġbelie",
- "ve"
- ],
- [
- "ar",
- "ies"
- ],
- [
- "æľī",
- "ä¸Ģ个"
- ],
- [
- "1",
- "3"
- ],
- [
- "Ġatmosp",
- "here"
- ],
- [
- "Ġm",
- "or"
- ],
- [
- "æĹ¥",
- "æľŁ"
- ],
- [
- "ä¹",
- "ħ"
- ],
- [
- "ä½ł",
- "好"
- ],
- [
- "Ġaddress",
- "ing"
- ],
- [
- "ĠâĢ",
- "ĵ"
- ],
- [
- "çļĦåľ°",
- "æĸ¹"
- ],
- [
- "m",
- "ing"
- ],
- [
- "Ġcan",
- "not"
- ],
- [
- "Ġman",
- "ufact"
- ],
- [
- "Ġp",
- "ie"
- ],
- [
- "ic",
- "ing"
- ],
- [
- "Ġstud",
- "ies"
- ],
- [
- "ç¾İ",
- "åij³"
- ],
- [
- "ĠAmeric",
- "an"
- ],
- [
- "ĠN",
- "LP"
- ],
- [
- "Ġacc",
- "ording"
- ],
- [
- "ms",
- "elves"
- ],
- [
- "èĦ",
- "Ĥ"
- ],
- [
- "èĩª",
- "ä¿¡"
- ],
- [
- "æīĢ",
- "éľĢ"
- ],
- [
- "Ġthe",
- "mselves"
- ],
- [
- "Ġremot",
- "e"
- ],
- [
- "åŁ¹",
- "åħ»"
- ],
- [
- "å®ī",
- "æİĴ"
- ],
- [
- "ä½ł",
- "éľĢè¦ģ"
- ],
- [
- "Ġreg",
- "ard"
- ],
- [
- "ir",
- "ing"
- ],
- [
- "è¯Ĩ",
- "åĪ«"
- ],
- [
- "Ġart",
- "icle"
- ],
- [
- "æģ",
- "Ĵ"
- ],
- [
- "æĢ»",
- "çļĦæĿ¥"
- ],
- [
- "Ġal",
- "ign"
- ],
- [
- "æ±",
- "ł"
- ],
- [
- "ten",
- "ance"
- ],
- [
- "fact",
- "ion"
- ],
- [
- "åĬ¨",
- "ä½ľ"
- ],
- [
- "çļĦç",
- "©"
- ],
- [
- "ç¼",
- "©"
- ],
- [
- "æĢ",
- "¥"
- ],
- [
- "Ġ1",
- "00"
- ],
- [
- "Ġtest",
- "ing"
- ],
- [
- "åŃĹ",
- "æ¯į"
- ],
- [
- "å¹´",
- "è½»"
- ],
- [
- "åζ",
- "éĢł"
- ],
- [
- "Ġs",
- "we"
- ],
- [
- "å°",
- "º"
- ],
- [
- "he",
- "ns"
- ],
- [
- "æ°´",
- "æŀľ"
- ],
- [
- "Ġinf",
- "rastructure"
- ],
- [
- "èī²",
- "彩"
- ],
- [
- "æĢ»çļĦæĿ¥",
- "说"
- ],
- [
- "æľī",
- "ä»Ģä¹Ī"
- ],
- [
- "te",
- "xt"
- ],
- [
- "车",
- "è¾Ĩ"
- ],
- [
- "Ġp",
- "ay"
- ],
- [
- "ro",
- "p"
- ],
- [
- "Ċ",
- "ĠĠ"
- ],
- [
- "Ġcaus",
- "ed"
- ],
- [
- "Ġcor",
- "rect"
- ],
- [
- "Ġ",
- "ì"
- ],
- [
- "èĥ",
- "ŀ"
- ],
- [
- "ĠM",
- "ed"
- ],
- [
- "ç²¾",
- "ç¥ŀ"
- ],
- [
- "æ°ĶåĢĻ",
- "åıĺåĮĸ"
- ],
- [
- "ĠR",
- "ed"
- ],
- [
- "äºĴ",
- "èģĶç½ij"
- ],
- [
- "Ġeng",
- "age"
- ],
- [
- "åĪĨ",
- "为"
- ],
- [
- "ĠD",
- "ata"
- ],
- [
- "Ġful",
- "l"
- ],
- [
- "en",
- "c"
- ],
- [
- "éĩį",
- "æĸ°"
- ],
- [
- "æŃ£ç¡®",
- "çļĦ"
- ],
- [
- "çļĦæ°",
- "Ķ"
- ],
- [
- "åıĮ",
- "æĸ¹"
- ],
- [
- "Ġcom",
- "es"
- ],
- [
- "åı¤",
- "代"
- ],
- [
- "æŁIJ",
- "äºĽ"
- ],
- [
- "åijĪ",
- "çݰ"
- ],
- [
- "Ġto",
- "day"
- ],
- [
- "ag",
- "ed"
- ],
- [
- "æĪij",
- "åı¯ä»¥"
- ],
- [
- "æĹ¥",
- "常"
- ],
- [
- "æ»",
- "ij"
- ],
- [
- "Ġcl",
- "in"
- ],
- [
- "Ġ",
- "\\"
- ],
- [
- "Ġo",
- "bs"
- ],
- [
- "Ġart",
- "ificial"
- ],
- [
- "Ġexce",
- "ll"
- ],
- [
- "çļĦç",
- "¬"
- ],
- [
- "all",
- "s"
- ],
- [
- "Ġprodu",
- "ce"
- ],
- [
- "ĠD",
- "es"
- ],
- [
- "os",
- "s"
- ],
- [
- "è¹",
- "Ī"
- ],
- [
- "Ġdra",
- "w"
- ],
- [
- "Ġlet",
- "ter"
- ],
- [
- "Ġadv",
- "ice"
- ],
- [
- "Ġhigh",
- "ly"
- ],
- [
- "çĬ",
- "¯"
- ],
- [
- "综ä¸Ĭ",
- "æīĢè¿°"
- ],
- [
- "满",
- "æĦı"
- ],
- [
- "Ġprinci",
- "ples"
- ],
- [
- "èĮ",
- "Ħ"
- ],
- [
- "Ġfeel",
- "ings"
- ],
- [
- "çļĦæ",
- "´"
- ],
- [
- "Ġh",
- "om"
- ],
- [
- "Ġf",
- "ail"
- ],
- [
- "Ġcro",
- "p"
- ],
- [
- "å§",
- "ľ"
- ],
- [
- "Ġquest",
- "ion"
- ],
- [
- "Ġdis",
- "abilities"
- ],
- [
- "èĪŀ",
- "è¹Ī"
- ],
- [
- "Ġimp",
- "lications"
- ],
- [
- "r",
- "al"
- ],
- [
- "Ġs",
- "ing"
- ],
- [
- "4",
- "0"
- ],
- [
- "Ġfam",
- "il"
- ],
- [
- "Ġgovern",
- "ments"
- ],
- [
- "Ġrec",
- "ord"
- ],
- [
- "å½¢",
- "çĬ¶"
- ],
- [
- "Ġbe",
- "gin"
- ],
- [
- "is",
- "es"
- ],
- [
- "çļĦæĥ",
- "³"
- ],
- [
- "ach",
- "ine"
- ],
- [
- "è°",
- "±"
- ],
- [
- "Ġv",
- "ulner"
- ],
- [
- "Ġpro",
- "per"
- ],
- [
- "Ġovers",
- "ight"
- ],
- [
- "è´Ł",
- "éĿ¢"
- ],
- [
- "Ġem",
- "ail"
- ],
- [
- "Ġnew",
- "s"
- ],
- [
- "Ġexpl",
- "oring"
- ],
- [
- "Ġf",
- "avor"
- ],
- [
- "æ¥",
- "¼"
- ],
- [
- "å®",
- "ľ"
- ],
- [
- "Ġun",
- "ivers"
- ],
- [
- "å·®",
- "å¼Ĥ"
- ],
- [
- "ï¼ī",
- "ãĢĤ"
- ],
- [
- "è§£åĨ³",
- "éĹ®é¢ĺ"
- ],
- [
- "Ġfam",
- "ous"
- ],
- [
- "g",
- "n"
- ],
- [
- "Ġmess",
- "age"
- ],
- [
- "at",
- "itude"
- ],
- [
- "Ġc",
- "ra"
- ],
- [
- "Ġco",
- "ver"
- ],
- [
- "æ·±",
- "åĪ»"
- ],
- [
- "åı¯ä»¥",
- "éĢīæĭ©"
- ],
- [
- "çĶŁæ´»",
- "ä¸Ń"
- ],
- [
- "ç§į",
- "ç±»"
- ],
- [
- "Ġsm",
- "art"
- ],
- [
- "on",
- "str"
- ],
- [
- "ve",
- "y"
- ],
- [
- "çĶ",
- "²"
- ],
- [
- "Ġreg",
- "ularly"
- ],
- [
- "ĠS",
- "m"
- ],
- [
- "æĦŁ",
- "è§ī"
- ],
- [
- "Ġthough",
- "t"
- ],
- [
- "Ġex",
- "h"
- ],
- [
- "c",
- "ure"
- ],
- [
- "ç»",
- "ĺ"
- ],
- [
- "认",
- "è¯Ĩ"
- ],
- [
- "Ġo",
- "ld"
- ],
- [
- "æĦ",
- "ī"
- ],
- [
- "ç§°",
- "为"
- ],
- [
- "Ġfiel",
- "ds"
- ],
- [
- "Ġcons",
- "ist"
- ],
- [
- "ã",
- "ģ"
- ],
- [
- "ç»Ĩ",
- "èĥŀ"
- ],
- [
- "Ġh",
- "ours"
- ],
- [
- "8",
- "0"
- ],
- [
- "al",
- "king"
- ],
- [
- "è§ī",
- "å¾Ĺ"
- ],
- [
- "ç»",
- "Ŀ"
- ],
- [
- "ä½ł",
- "们"
- ],
- [
- "ĠEng",
- "lish"
- ],
- [
- "Ġsignificant",
- "ly"
- ],
- [
- "Ġs",
- "ource"
- ],
- [
- "Ġan",
- "t"
- ],
- [
- "Ġeducation",
- "al"
- ],
- [
- "Ġtas",
- "k"
- ],
- [
- "Ġhand",
- "le"
- ],
- [
- "æIJ",
- "ľ"
- ],
- [
- "ĠS",
- "p"
- ],
- [
- "Ġcall",
- "ed"
- ],
- [
- "Ġter",
- "ms"
- ],
- [
- "æ²",
- "ī"
- ],
- [
- "Ġw",
- "in"
- ],
- [
- "duct",
- "ion"
- ],
- [
- "Ġmod",
- "ern"
- ],
- [
- "Ġcuis",
- "ine"
- ],
- [
- "å¥",
- "Ĺ"
- ],
- [
- "è§",
- "¦"
- ],
- [
- "olut",
- "ely"
- ],
- [
- "ç«",
- "¥"
- ],
- [
- "p",
- "ite"
- ],
- [
- "Ġf",
- "elt"
- ],
- [
- "Ġcomp",
- "re"
- ],
- [
- "Ġw",
- "ond"
- ],
- [
- "è¿IJ",
- "è¡Į"
- ],
- [
- "Ġres",
- "il"
- ],
- [
- "缸",
- "ä¼¼"
- ],
- [
- "éĩij",
- "èŀį"
- ],
- [
- "çα",
- "æĥħ"
- ],
- [
- "ç¬",
- "Ķ"
- ],
- [
- "èĪ",
- "ª"
- ],
- [
- "è°",
- "Ī"
- ],
- [
- "åĬĽ",
- "çļĦ"
- ],
- [
- "æľī",
- "æīĢ"
- ],
- [
- "æ½",
- "ľ"
- ],
- [
- "ul",
- "ate"
- ],
- [
- "Ġdetect",
- "ion"
- ],
- [
- "宣",
- "ä¼ł"
- ],
- [
- "Ġmat",
- "ter"
- ],
- [
- "éĩı",
- "åŃIJ"
- ],
- [
- "W",
- "rite"
- ],
- [
- "ç»ĵ",
- "åIJĪ"
- ],
- [
- "ç»ı",
- "è¿ĩ"
- ],
- [
- "Ġdevelop",
- "ers"
- ],
- [
- "è",
- "ª"
- ],
- [
- "Ġ",
- "---"
- ],
- [
- "人",
- "éĻħ"
- ],
- [
- "çŃ",
- "¾"
- ],
- [
- "ï¼ļ",
- "âĢľ"
- ],
- [
- "Ġinnov",
- "ative"
- ],
- [
- "ãĢĤ",
- "âĢĿ"
- ],
- [
- "å½",
- "¼"
- ],
- [
- "é¥",
- "¼"
- ],
- [
- "è¿ĩ",
- "度"
- ],
- [
- "Ġplan",
- "et"
- ],
- [
- "åħ",
- "°"
- ],
- [
- "å¸",
- "ģ"
- ],
- [
- "æķ",
- "¬"
- ],
- [
- "Ġleg",
- "al"
- ],
- [
- "Ġlo",
- "t"
- ],
- [
- "æĪIJ为",
- "äºĨ"
- ],
- [
- "i",
- "ate"
- ],
- [
- "Ġm",
- "is"
- ],
- [
- "åģĩ",
- "设"
- ],
- [
- "çļĦ",
- "æĸĩ竳"
- ],
- [
- "ĠCom",
- "pan"
- ],
- [
- "Ġd",
- "oc"
- ],
- [
- "Ġcare",
- "ful"
- ],
- [
- "Ġe",
- "ver"
- ],
- [
- "æĪij们",
- "å°Ĩ"
- ],
- [
- "ä¾ĭ",
- "åŃIJ"
- ],
- [
- "ä¹",
- "³"
- ],
- [
- "ä½ľ",
- "èĢħ"
- ],
- [
- "åIJ",
- "§"
- ],
- [
- "æļ",
- "´"
- ],
- [
- "Ġrem",
- "ember"
- ],
- [
- "缮",
- "çļĦ"
- ],
- [
- "Ġp",
- "ut"
- ],
- [
- "常è§ģ",
- "çļĦ"
- ],
- [
- "Ġf",
- "est"
- ],
- [
- "建",
- "设"
- ],
- [
- "å®ŀ",
- "ç͍"
- ],
- [
- "Ġact",
- "ive"
- ],
- [
- "çª",
- "Ĺ"
- ],
- [
- "ou",
- "th"
- ],
- [
- "åİŁ",
- "çIJĨ"
- ],
- [
- "Ġtry",
- "ing"
- ],
- [
- "è¿",
- "·"
- ],
- [
- "缸",
- "åIJĮ"
- ],
- [
- "éħĴ",
- "åºĹ"
- ],
- [
- "An",
- "other"
- ],
- [
- "æľĢ",
- "ä½³"
- ],
- [
- "Ġanaly",
- "tics"
- ],
- [
- "Ġper",
- "pet"
- ],
- [
- "ip",
- "ment"
- ],
- [
- "Ġ",
- "å¦Ĥæŀľ"
- ],
- [
- "è§Ĥ",
- "ä¼Ĺ"
- ],
- [
- "Ġc",
- "elebr"
- ],
- [
- "Ġhe",
- "av"
- ],
- [
- "Ġmed",
- "itation"
- ],
- [
- "大",
- "æ°Ķ"
- ],
- [
- "A",
- "nd"
- ],
- [
- "ä¸į",
- "éĶĻ"
- ],
- [
- "Ġwhe",
- "ther"
- ],
- [
- "s",
- "et"
- ],
- [
- "Ġdem",
- "onstr"
- ],
- [
- "ä¸Ģ",
- "款"
- ],
- [
- "æĶ¶",
- "éĽĨ"
- ],
- [
- "éĻIJ",
- "åζ"
- ],
- [
- "Ġ",
- "ing"
- ],
- [
- "Ġrev",
- "olution"
- ],
- [
- "çľ",
- "ģ"
- ],
- [
- "Ġsc",
- "ience"
- ],
- [
- "缮",
- "åīį"
- ],
- [
- "Ġthink",
- "ing"
- ],
- [
- "±",
- "ä¹IJ"
- ],
- [
- "课",
- "ç¨ĭ"
- ],
- [
- "Ġp",
- "ack"
- ],
- [
- "Ġim",
- "age"
- ],
- [
- "lo",
- "c"
- ],
- [
- "Ġst",
- "ories"
- ],
- [
- "uc",
- "k"
- ],
- [
- "Ġsatis",
- "faction"
- ],
- [
- "Ġcollect",
- "ion"
- ],
- [
- "h",
- "o"
- ],
- [
- "èµ",
- "ŀ"
- ],
- [
- "éĿ¢",
- "临"
- ],
- [
- "Ġl",
- "a"
- ],
- [
- "Ġsym",
- "bol"
- ],
- [
- "Ġem",
- "b"
- ],
- [
- "Ġhabit",
- "ats"
- ],
- [
- "Ġlow",
- "er"
- ],
- [
- "Ġcontin",
- "ues"
- ],
- [
- "éľ",
- "ĩ"
- ],
- [
- "åĵ",
- "Ī"
- ],
- [
- "ĠT",
- "ake"
- ],
- [
- "Ġenviron",
- "ments"
- ],
- [
- "Ġth",
- "ree"
- ],
- [
- "Ġen",
- "c"
- ],
- [
- "ĠA",
- "cc"
- ],
- [
- "æĦı",
- "åij³"
- ],
- [
- "åİ",
- "¨"
- ],
- [
- "ch",
- "an"
- ],
- [
- "ĠH",
- "um"
- ],
- [
- "Ġtr",
- "ue"
- ],
- [
- "åĪĩ",
- "æĪIJ"
- ],
- [
- "s",
- "ing"
- ],
- [
- "âĢĶ",
- "âĢĶ"
- ],
- [
- "åĩº",
- "æĿ¥"
- ],
- [
- "Ġreg",
- "ion"
- ],
- [
- "Ġinter",
- "pre"
- ],
- [
- "Ġdiagnos",
- "is"
- ],
- [
- "é",
- "ŀ"
- ],
- [
- "Ġdo",
- "ing"
- ],
- [
- "Ġr",
- "un"
- ],
- [
- "Ġco",
- "ffee"
- ],
- [
- "Ġmaj",
- "or"
- ],
- [
- "Ġmindful",
- "ness"
- ],
- [
- "Ġafford",
- "able"
- ],
- [
- "çĻ",
- "¾"
- ],
- [
- "Ġdetail",
- "ed"
- ],
- [
- "éĿŀ常",
- "éĩįè¦ģçļĦ"
- ],
- [
- "çļĦæ²",
- "ŁéĢļ"
- ],
- [
- "çļĦæķ",
- "ħ"
- ],
- [
- "åĢĴ",
- "åħ¥"
- ],
- [
- "Ġthem",
- "es"
- ],
- [
- "Ġnet",
- "work"
- ],
- [
- "ï¼ī",
- "ï¼ļ"
- ],
- [
- "ĠUn",
- "ited"
- ],
- [
- "çļĦæĮ",
- "ĩ"
- ],
- [
- "ort",
- "s"
- ],
- [
- "åį«",
- "çĶŁ"
- ],
- [
- "Ġplan",
- "ning"
- ],
- [
- "æĥ",
- "ł"
- ],
- [
- "åī",
- "ª"
- ],
- [
- "ĠPro",
- "v"
- ],
- [
- "çļĦ",
- "åºĶç͍"
- ],
- [
- "Ġp",
- "eri"
- ],
- [
- "Ġaccount",
- "able"
- ],
- [
- "çī",
- "Ļ"
- ],
- [
- "çļĦç",
- "ģ"
- ],
- [
- "Ġcho",
- "ice"
- ],
- [
- "ĠC",
- "omm"
- ],
- [
- "id",
- "ents"
- ],
- [
- "çļĦ",
- "å®īåħ¨"
- ],
- [
- "å¹¶",
- "ä¸į"
- ],
- [
- "太éĺ³",
- "ç³»"
- ],
- [
- "Ġrece",
- "ive"
- ],
- [
- "Ġclo",
- "se"
- ],
- [
- "çļĦæĹ¶",
- "åĢĻ"
- ],
- [
- "Ġchang",
- "ing"
- ],
- [
- "ä»·å̼",
- "è§Ĥ"
- ],
- [
- "Ġperpet",
- "u"
- ],
- [
- "Ġse",
- "ason"
- ],
- [
- "Ġm",
- "en"
- ],
- [
- "Ġlearn",
- "ed"
- ],
- [
- "Ġsitu",
- "ation"
- ],
- [
- "Ġre",
- "place"
- ],
- [
- "he",
- "ad"
- ],
- [
- "让",
- "æĪij"
- ],
- [
- "åľ¨",
- "ä¸Ģèµ·"
- ],
- [
- "çļĦç©",
- "º"
- ],
- [
- "éľ",
- "²"
- ],
- [
- "Ġen",
- "ough"
- ],
- [
- "å±ķ",
- "çݰ"
- ],
- [
- "Ġlead",
- "ers"
- ],
- [
- "an",
- "cing"
- ],
- [
- "Ġtemper",
- "ature"
- ],
- [
- "åı",
- "«"
- ],
- [
- "Ġ3",
- "0"
- ],
- [
- "æĦıåij³",
- "çĿĢ"
- ],
- [
- "æ±",
- "ĩ"
- ],
- [
- "ĠGo",
- "vern"
- ],
- [
- "Ġfocus",
- "ed"
- ],
- [
- "u",
- "ro"
- ],
- [
- "Ġsim",
- "ple"
- ],
- [
- "Ġh",
- "iking"
- ],
- [
- "æ¯",
- "Ĵ"
- ],
- [
- "Ġcompre",
- "hens"
- ],
- [
- "äº",
- "Ī"
- ],
- [
- "Ġcreat",
- "ed"
- ],
- [
- "con",
- "d"
- ],
- [
- "é¡",
- "µ"
- ],
- [
- "ĠW",
- "or"
- ],
- [
- "è¯ģ",
- "æį®"
- ],
- [
- "Ġwork",
- "place"
- ],
- [
- "Ġcharact",
- "ers"
- ],
- [
- "çļĦ",
- "设计"
- ],
- [
- "Ġme",
- "chan"
- ],
- [
- "ĠD",
- "is"
- ],
- [
- "ç¥ŀ",
- "ç§ĺ"
- ],
- [
- "å·",
- "ŀ"
- ],
- [
- "ĠO",
- "n"
- ],
- [
- "<",
- "/"
- ],
- [
- "ç§į",
- "æ¤į"
- ],
- [
- "Ġpat",
- "h"
- ],
- [
- "Ġlim",
- "ited"
- ],
- [
- "Ġsol",
- "ar"
- ],
- [
- "çļĦæ",
- "ı"
- ],
- [
- "2",
- "2"
- ],
- [
- "Ġappreci",
- "ate"
- ],
- [
- "å¿«",
- "ä¹IJ"
- ],
- [
- "æĦŁ",
- "åıĹåΰ"
- ],
- [
- "èĢ",
- "Ĺ"
- ],
- [
- "m",
- "ed"
- ],
- [
- "ic",
- "ine"
- ],
- [
- "Ġnot",
- "e"
- ],
- [
- "å½ĵ",
- "åīį"
- ],
- [
- "æĪij们",
- "åºĶ该"
- ],
- [
- "Ġse",
- "en"
- ],
- [
- "ä¸Ģ",
- "åIJį"
- ],
- [
- "å°½",
- "åı¯èĥ½"
- ],
- [
- "è¿IJ",
- "ç®Ĺ"
- ],
- [
- "è§Ĵ",
- "度"
- ],
- [
- "Ġequ",
- "ipment"
- ],
- [
- "Ġsp",
- "read"
- ],
- [
- "è",
- "¸"
- ],
- [
- "è®",
- "¿"
- ],
- [
- "åı¥",
- "è¯Ŀ"
- ],
- [
- "æĮ",
- "¥"
- ],
- [
- "Ġpur",
- "pose"
- ],
- [
- "请",
- "ä½ł"
- ],
- [
- "Y",
- "our"
- ],
- [
- "ari",
- "an"
- ],
- [
- "ä»",
- "ª"
- ],
- [
- "Ġperspect",
- "ives"
- ],
- [
- "åĩº",
- "äºĨ"
- ],
- [
- "å©ļ",
- "礼"
- ],
- [
- "Ġexcell",
- "ent"
- ],
- [
- "ĠEns",
- "uring"
- ],
- [
- "Ġre",
- "ach"
- ],
- [
- "éĺ¶",
- "段"
- ],
- [
- "ä¿Ŀ",
- "éļľ"
- ],
- [
- "Ġemp",
- "athy"
- ],
- [
- "ĠM",
- "y"
- ],
- [
- "çij",
- "ľä¼½"
- ],
- [
- "Ġ",
- "ver"
- ],
- [
- "ab",
- "el"
- ],
- [
- "ĠPre",
- "dict"
- ],
- [
- "Ġmain",
- "tenance"
- ],
- [
- "è¯Ħ",
- "ä»·"
- ],
- [
- "Ġ",
- "ult"
- ],
- [
- "åĴ",
- "¨"
- ],
- [
- "o",
- "x"
- ],
- [
- "åĴ¨",
- "询"
- ],
- [
- "Ġshare",
- "d"
- ],
- [
- "in",
- "a"
- ],
- [
- "l",
- "ist"
- ],
- [
- "Ġoutdo",
- "or"
- ],
- [
- "Ġthough",
- "ts"
- ],
- [
- "in",
- "ating"
- ],
- [
- "éĴ",
- "±"
- ],
- [
- "Ġfra",
- "me"
- ],
- [
- "éĺ",
- "¿"
- ],
- [
- "åĪ©",
- "润"
- ],
- [
- "çļĦæİ",
- "¨"
- ],
- [
- "åį",
- "ļ"
- ],
- [
- "Ġrec",
- "ent"
- ],
- [
- "Ġal",
- "tern"
- ],
- [
- "are",
- "d"
- ],
- [
- "=",
- "="
- ],
- [
- "Ġro",
- "ad"
- ],
- [
- "äºĭ",
- "项"
- ],
- [
- "g",
- "ed"
- ],
- [
- "y",
- "nt"
- ],
- [
- "Ġspe",
- "nd"
- ],
- [
- "ç½",
- "ª"
- ],
- [
- "åıĸ",
- "å¾Ĺ"
- ],
- [
- "é",
- "¹"
- ],
- [
- "l",
- "i"
- ],
- [
- "æĹ¶",
- "æľŁ"
- ],
- [
- "严",
- "éĩį"
- ],
- [
- "å¿",
- "Ĩ"
- ],
- [
- "å©",
- "´"
- ],
- [
- "æİ¥",
- "ä¸ĭæĿ¥"
- ],
- [
- "ĠEar",
- "th"
- ],
- [
- "ĠChat",
- "bots"
- ],
- [
- "Ġset",
- "ting"
- ],
- [
- "ç¥",
- "Ŀ"
- ],
- [
- "éĶĢåĶ®",
- "é¢Ŀ"
- ],
- [
- "ä¼",
- "¦"
- ],
- [
- "Ġread",
- "ing"
- ],
- [
- "æİ¢",
- "讨"
- ],
- [
- "a",
- "ign"
- ],
- [
- "éŀ",
- "ĭ"
- ],
- [
- "Ġyou",
- "ng"
- ],
- [
- "Ġcare",
- "er"
- ],
- [
- "Ġteac",
- "hers"
- ],
- [
- "çļĦ",
- "è´¨éĩı"
- ],
- [
- "å±ŀ",
- "äºİ"
- ],
- [
- "Ġeas",
- "ier"
- ],
- [
- "Ġscient",
- "ific"
- ],
- [
- "ç¾İ",
- "åħĥ"
- ],
- [
- "Ġsp",
- "ir"
- ],
- [
- "åĬ",
- "³"
- ],
- [
- "çļĦæĶ",
- "¯"
- ],
- [
- "r",
- "ist"
- ],
- [
- "èµĦ",
- "产"
- ],
- [
- "çĶŁ",
- "åŃĺ"
- ],
- [
- "èĩ³",
- "å°ij"
- ],
- [
- "å§",
- "¿"
- ],
- [
- "Ġvide",
- "o"
- ],
- [
- "Ġa",
- "im"
- ],
- [
- "å®Ŀ",
- "å®Ŀ"
- ],
- [
- "çζ",
- "æ¯į"
- ],
- [
- "________",
- "________"
- ],
- [
- "al",
- "ities"
- ],
- [
- "Ġb",
- "ud"
- ],
- [
- "Ġstre",
- "et"
- ],
- [
- "Ġ",
- "æĺ¯"
- ],
- [
- "æĸ¹",
- "ç¨ĭ"
- ],
- [
- "ä¸ĸ",
- "纪"
- ],
- [
- "c",
- "hes"
- ],
- [
- "ear",
- "ch"
- ],
- [
- "æĴ",
- "°"
- ],
- [
- "Ġeng",
- "ine"
- ],
- [
- "Ġdis",
- "placement"
- ],
- [
- "ĠRo",
- "bots"
- ],
- [
- "erv",
- "ised"
- ],
- [
- "é¡",
- "¶"
- ],
- [
- "ou",
- "d"
- ],
- [
- "Ġw",
- "alk"
- ],
- [
- "Ġemerg",
- "ency"
- ],
- [
- "èģ",
- "ĺ"
- ],
- [
- "n",
- "al"
- ],
- [
- "Ġdat",
- "as"
- ],
- [
- "åĢ",
- "º"
- ],
- [
- "åIJİ",
- "çļĦ"
- ],
- [
- "å¾Ī",
- "好"
- ],
- [
- "Ġmy",
- "self"
- ],
- [
- "çļĦæī",
- "ĭ"
- ],
- [
- "Ġus",
- "age"
- ],
- [
- "Ġsh",
- "own"
- ],
- [
- "æ®",
- "Ĭ"
- ],
- [
- "Ġtyp",
- "ically"
- ],
- [
- "u",
- "ly"
- ],
- [
- "æĸ°",
- "éĹ»"
- ],
- [
- "æĽ",
- "¿"
- ],
- [
- "Ġor",
- "ig"
- ],
- [
- "è½»",
- "æĿ¾"
- ],
- [
- "æĺ¾",
- "示"
- ],
- [
- "Ġado",
- "pt"
- ],
- [
- "èĤ¡",
- "票"
- ],
- [
- "Ġp",
- "arent"
- ],
- [
- "a",
- "ps"
- ],
- [
- "æĢĿ",
- "æĥ³"
- ],
- [
- "Ġmarket",
- "ing"
- ],
- [
- "èĻ",
- "«"
- ],
- [
- "éĥ¨",
- "éŨ"
- ],
- [
- "çļĦæķ",
- "Ī"
- ],
- [
- "Ġcomfort",
- "able"
- ],
- [
- "åŃ¦ä¹ł",
- "åĴĮ"
- ],
- [
- "Ġfore",
- "cast"
- ],
- [
- "ict",
- "ion"
- ],
- [
- "Ġget",
- "ting"
- ],
- [
- "Ġtre",
- "es"
- ],
- [
- "av",
- "ing"
- ],
- [
- "çļĦ",
- "åŁºç¡Ģ"
- ],
- [
- "read",
- "y"
- ],
- [
- "æĸ°",
- "é²ľ"
- ],
- [
- "go",
- "ing"
- ],
- [
- "¹",
- "é¥"
- ],
- [
- "Ġev",
- "idence"
- ],
- [
- "¹é¥",
- "ª"
- ],
- [
- "ç§",
- "ĭ"
- ],
- [
- "æľī",
- "å¾Īå¤ļ"
- ],
- [
- "éĿ¢",
- "è¯ķ"
- ],
- [
- "éģĩ",
- "åΰ"
- ],
- [
- "ç»Ļ",
- "å®ļ"
- ],
- [
- "ir",
- "c"
- ],
- [
- "åı¯ä»¥",
- "æł¹æį®"
- ],
- [
- "驾",
- "é©¶"
- ],
- [
- "å·§",
- "åħĭ"
- ],
- [
- "Ġst",
- "unning"
- ],
- [
- "çļĦæ",
- "¦Ĥ"
- ],
- [
- "æ¡",
- "Į"
- ],
- [
- "ĠJ",
- "ohn"
- ],
- [
- "ul",
- "ation"
- ],
- [
- "åıĤ",
- "èĢĥ"
- ],
- [
- "Ġf",
- "lex"
- ],
- [
- "çĦ¦",
- "èĻij"
- ],
- [
- "ym",
- "akers"
- ],
- [
- "Ġfor",
- "ms"
- ],
- [
- "s",
- "h"
- ],
- [
- "v",
- "al"
- ],
- [
- "ĠS",
- "o"
- ],
- [
- "c",
- "o"
- ],
- [
- "æİ¨",
- "åĬ¨"
- ],
- [
- "èħ",
- "¿"
- ],
- [
- "çī¹",
- "æ®Ĭ"
- ],
- [
- "Ġen",
- "ab"
- ],
- [
- "å°Ĩ",
- "ä¼ļ"
- ],
- [
- "æĶ¯",
- "åĩº"
- ],
- [
- "åĿļ",
- "æĮģ"
- ],
- [
- "红",
- "èī²"
- ],
- [
- "Ġopt",
- "ion"
- ],
- [
- "Ġstart",
- "ed"
- ],
- [
- "r",
- "ation"
- ],
- [
- "Ġpo",
- "etry"
- ],
- [
- "Ġp",
- "ort"
- ],
- [
- "g",
- "en"
- ],
- [
- "èª",
- "ī"
- ],
- [
- "Ġdel",
- "iv"
- ],
- [
- "çĶ",
- "ļ"
- ],
- [
- "éĢ",
- "»"
- ],
- [
- "éĢī",
- "项"
- ],
- [
- "Ġg",
- "round"
- ],
- [
- "å½¼",
- "æŃ¤"
- ],
- [
- "an",
- "a"
- ],
- [
- "çļĦæĹ",
- "¥"
- ],
- [
- "åľ¨",
- "线"
- ],
- [
- "Ġse",
- "cure"
- ],
- [
- "Ġ",
- "æł¹æį®"
- ],
- [
- "饮",
- "æĸĻ"
- ],
- [
- "Ġgr",
- "atitude"
- ],
- [
- "第",
- "ä¸ī"
- ],
- [
- "Ġs",
- "ong"
- ],
- [
- "Ġpoint",
- "s"
- ],
- [
- "Ġal",
- "ready"
- ],
- [
- "çļĦçĪ",
- "±"
- ],
- [
- "ĠTe",
- "chn"
- ],
- [
- "Ġreal",
- "ity"
- ],
- [
- "çı",
- "Ń"
- ],
- [
- "Ġs",
- "ince"
- ],
- [
- "Ġpopul",
- "ation"
- ],
- [
- "y",
- "ond"
- ],
- [
- "b",
- "or"
- ],
- [
- "ĠSoc",
- "ial"
- ],
- [
- "æıIJ",
- "åıĸ"
- ],
- [
- "å·¥",
- "ç¨ĭ"
- ],
- [
- "a",
- "ff"
- ],
- [
- "交",
- "æĺĵ"
- ],
- [
- "Ġwor",
- "th"
- ],
- [
- "å¡",
- "«"
- ],
- [
- "å¨",
- "±ä¹IJ"
- ],
- [
- "Ġdo",
- "g"
- ],
- [
- "ĠAr",
- "t"
- ],
- [
- "ç¡",
- "¬"
- ],
- [
- "æµ·",
- "æ´ĭ"
- ],
- [
- "åĨ",
- "Ĵ"
- ],
- [
- "çī",
- "Ī"
- ],
- [
- "Ġprogramm",
- "ing"
- ],
- [
- "ĠAs",
- "s"
- ],
- [
- "ĠM",
- "achine"
- ],
- [
- "å̼",
- "å¾Ĺ"
- ],
- [
- "请",
- "è¾ĵåħ¥"
- ],
- [
- "声",
- "éŁ³"
- ],
- [
- "Ġexercis",
- "es"
- ],
- [
- "åħī",
- "线"
- ],
- [
- "æ³ķ",
- "åĴĮ"
- ],
- [
- "Ġfeat",
- "ure"
- ],
- [
- "e",
- "ff"
- ],
- [
- "è¿Ľ",
- "æŃ¥"
- ],
- [
- "女",
- "æĢ§"
- ],
- [
- "Ġefficient",
- "ly"
- ],
- [
- "çļĦæĬĢ",
- "æľ¯"
- ],
- [
- "Ġgen",
- "etic"
- ],
- [
- "令",
- "人"
- ],
- [
- "è´",
- "¦"
- ],
- [
- "çļĦ",
- "产åĵģ"
- ],
- [
- "åİ",
- "ļ"
- ],
- [
- "åĴĮ",
- "æĸĩåĮĸ"
- ],
- [
- "éĻ",
- "Ħ"
- ],
- [
- "Ġmo",
- "b"
- ],
- [
- "综",
- "åIJĪ"
- ],
- [
- "t",
- "ers"
- ],
- [
- "æľī",
- "ä¸Ģ"
- ],
- [
- "å¦",
- "Ĩ"
- ],
- [
- "åį",
- "Ī"
- ],
- [
- "Ġout",
- "side"
- ],
- [
- "Ġprop",
- "ert"
- ],
- [
- "éĤ®",
- "ä»¶"
- ],
- [
- "主",
- "ä¹ī"
- ],
- [
- "Ġpolic",
- "y"
- ],
- [
- "èĩª",
- "身"
- ],
- [
- "Ġnav",
- "igate"
- ],
- [
- "Ġst",
- "y"
- ],
- [
- "ç͵",
- "èĦij"
- ],
- [
- "Ġab",
- "ilities"
- ],
- [
- "Ġfac",
- "ed"
- ],
- [
- "çļĦç",
- "¼"
- ],
- [
- "çļĦ",
- "å°ı"
- ],
- [
- "è",
- "ķ"
- ],
- [
- "Ġt",
- "one"
- ],
- [
- "ig",
- "ation"
- ],
- [
- "åıĤ",
- "æķ°"
- ],
- [
- "èĽĭçϽ",
- "è´¨"
- ],
- [
- "ä½",
- "Ľ"
- ],
- [
- "çĶļ",
- "èĩ³"
- ],
- [
- "Ġsk",
- "in"
- ],
- [
- "èĴ",
- "¸"
- ],
- [
- "æĭ",
- "Ľ"
- ],
- [
- "éŃ",
- "Ķ"
- ],
- [
- "ash",
- "ion"
- ],
- [
- "Ġing",
- "red"
- ],
- [
- "æĹ",
- "ĭ"
- ],
- [
- "Ġcamp",
- "aign"
- ],
- [
- "Ġm",
- "ount"
- ],
- [
- "Ġcons",
- "id"
- ],
- [
- "Ġmus",
- "e"
- ],
- [
- "n",
- "ter"
- ],
- [
- "w",
- "ater"
- ],
- [
- "ä¼ļ",
- "è®®"
- ],
- [
- "Ġprotect",
- "ion"
- ],
- [
- "ä¿Ŀ",
- "éĻ©"
- ],
- [
- "Ġcro",
- "ps"
- ],
- [
- "og",
- "le"
- ],
- [
- "éļı",
- "æĹ¶"
- ],
- [
- "æļ",
- "Ĺ"
- ],
- [
- "i",
- "um"
- ],
- [
- "ä¹",
- "ı"
- ],
- [
- "Ġdi",
- "et"
- ],
- [
- "l",
- "ies"
- ],
- [
- "ç͍",
- "æĿ¥"
- ],
- [
- "ĠEn",
- "coura"
- ],
- [
- "æĬ",
- "Ĺ"
- ],
- [
- "ap",
- "an"
- ],
- [
- "éĺ²",
- "æŃ¢"
- ],
- [
- "W",
- "ow"
- ],
- [
- "çļĦ",
- "åŁºæľ¬"
- ],
- [
- "å¹³",
- "æĸ¹"
- ],
- [
- "Ġst",
- "ep"
- ],
- [
- "åı¯",
- "éĿł"
- ],
- [
- "表",
- "æĺİ"
- ],
- [
- "Ġpredict",
- "ions"
- ],
- [
- "Ġsym",
- "pt"
- ],
- [
- "Ġdiagnos",
- "es"
- ],
- [
- "åħ¬",
- "åĽŃ"
- ],
- [
- "Ġsupp",
- "ly"
- ],
- [
- "Ġprev",
- "ious"
- ],
- [
- "ç»Ħ",
- "åIJĪ"
- ],
- [
- ".",
- ","
- ],
- [
- "çļĦ",
- "è¿ĩç¨ĭ"
- ],
- [
- "æķ",
- "ı"
- ],
- [
- "s",
- "u"
- ],
- [
- "ar",
- "is"
- ],
- [
- "çķ",
- "ħ"
- ],
- [
- "oc",
- "ol"
- ],
- [
- "æIJľ",
- "ç´¢"
- ],
- [
- "it",
- "le"
- ],
- [
- "éĨ",
- "Ĵ"
- ],
- [
- "顾",
- "客"
- ],
- [
- "éĢ»",
- "è¾ij"
- ],
- [
- "éĿŀ常",
- "éĩįè¦ģ"
- ],
- [
- "ĠB",
- "i"
- ],
- [
- "å·¦",
- "åı³"
- ],
- [
- "am",
- "m"
- ],
- [
- "Ġevery",
- "thing"
- ],
- [
- "æĺ",
- "ł"
- ],
- [
- "Ġincre",
- "d"
- ],
- [
- "Ġpe",
- "ace"
- ],
- [
- "èľ",
- "ľ"
- ],
- [
- "Ġmuse",
- "um"
- ],
- [
- "çĭ¬",
- "ç«ĭ"
- ],
- [
- "Ġcomprehens",
- "ive"
- ],
- [
- "Ġr",
- "ates"
- ],
- [
- "/",
- "/"
- ],
- [
- "Ġra",
- "d"
- ],
- [
- "åĦ¿",
- "ç«¥"
- ],
- [
- "çī¹",
- "èī²"
- ],
- [
- "ĠPredict",
- "ive"
- ],
- [
- "å¼ķ",
- "åĬĽ"
- ],
- [
- "l",
- "er"
- ],
- [
- "å°",
- "¤"
- ],
- [
- "ic",
- "ro"
- ],
- [
- "è¡",
- "¥"
- ],
- [
- "Ġdeterm",
- "ine"
- ],
- [
- "çļĦ",
- "åĨħ容"
- ],
- [
- "Ġcom",
- "pl"
- ],
- [
- "Ġgreen",
- "house"
- ],
- [
- "èħ",
- "IJ"
- ],
- [
- "Ġhigh",
- "light"
- ],
- [
- "Ġpart",
- "ners"
- ],
- [
- "Ġdo",
- "ct"
- ],
- [
- "çļĦ",
- "使ç͍"
- ],
- [
- "æŃĮ",
- "æĽ²"
- ],
- [
- "æĮĩ",
- "åįĹ"
- ],
- [
- "ĠA",
- "f"
- ],
- [
- "æľº",
- "æŀĦ"
- ],
- [
- "éĢ",
- "Ģ"
- ],
- [
- "Ġpoem",
- "s"
- ],
- [
- "å¿ĥ",
- "åĴĮ"
- ],
- [
- "Ġatt",
- "end"
- ],
- [
- "çļĦæ¸",
- "¸"
- ],
- [
- "Ġs",
- "ide"
- ],
- [
- "al",
- "es"
- ],
- [
- "Ġmention",
- "ed"
- ],
- [
- "ĠA",
- "bs"
- ],
- [
- "Ġhistor",
- "ical"
- ],
- [
- "Ġle",
- "ft"
- ],
- [
- "以ä¸ĭ",
- "åĩłä¸ª"
- ],
- [
- "åıĹ",
- "欢è¿İ"
- ],
- [
- "èıľ",
- "åĵģ"
- ],
- [
- "Ġrem",
- "ain"
- ],
- [
- "æ",
- "ĩ"
- ],
- [
- "Ġtour",
- "s"
- ],
- [
- "ł",
- "éģĵ"
- ],
- [
- "Ġerr",
- "ors"
- ],
- [
- "æľº",
- "åζ"
- ],
- [
- "æ",
- "¦"
- ],
- [
- "æĤ£",
- "èĢħ"
- ],
- [
- "m",
- "ore"
- ],
- [
- "Ġexpert",
- "s"
- ],
- [
- "çļĦçł",
- "Ķç©¶"
- ],
- [
- "ç»ĵ",
- "æĿŁ"
- ],
- [
- "Ġwrit",
- "ten"
- ],
- [
- "çł",
- "Ķ"
- ],
- [
- "Ġe",
- "t"
- ],
- [
- "in",
- "put"
- ],
- [
- "æ°Ķ",
- "ä½ĵ"
- ],
- [
- "è",
- "ļ"
- ],
- [
- "æĥ",
- "Ĭ"
- ],
- [
- "Ġa",
- "ge"
- ],
- [
- "éĩį",
- "å¤į"
- ],
- [
- "å¼",
- "¹"
- ],
- [
- "åŃ",
- "¤"
- ],
- [
- "Ġsympt",
- "oms"
- ],
- [
- "Ġbelie",
- "f"
- ],
- [
- "'",
- "d"
- ],
- [
- "i",
- "ol"
- ],
- [
- "Ġ1",
- "8"
- ],
- [
- "åħħ",
- "è¶³"
- ],
- [
- "çı",
- "į"
- ],
- [
- "force",
- "ment"
- ],
- [
- "æĸ",
- "Ĺ"
- ],
- [
- "ª",
- "èĮĦ"
- ],
- [
- "Ġ1",
- "5"
- ],
- [
- "ä¸Ģ个",
- "人"
- ],
- [
- "Ġapp",
- "lic"
- ],
- [
- "è´",
- "¥"
- ],
- [
- "ä½į",
- "äºİ"
- ],
- [
- "éϤ",
- "äºĨ"
- ],
- [
- "=",
- "\""
- ],
- [
- "ä¸ī",
- "è§Ĵ"
- ],
- [
- "æĢĿ",
- "ç»´"
- ],
- [
- "åį",
- "·"
- ],
- [
- "Ġf",
- "ru"
- ],
- [
- "ĠCol",
- "labor"
- ],
- [
- "Ġpr",
- "im"
- ],
- [
- "Ġrequire",
- "d"
- ],
- [
- "Ġw",
- "atch"
- ],
- [
- "è°ĥ",
- "åij³"
- ],
- [
- "ç»ĵ",
- "论"
- ],
- [
- "on",
- "y"
- ],
- [
- "Ġgu",
- "ide"
- ],
- [
- "Ġm",
- "ax"
- ],
- [
- "ĠC",
- "ould"
- ],
- [
- "Ġadv",
- "ent"
- ],
- [
- "ĠO",
- "verall"
- ],
- [
- "çļĦæĬ",
- "ķ"
- ],
- [
- "Ġexp",
- "er"
- ],
- [
- "å",
- "ĺ"
- ],
- [
- "ic",
- "ial"
- ],
- [
- "ost",
- "er"
- ],
- [
- "çļĦ",
- "é¢ľèī²"
- ],
- [
- "Ġoper",
- "ations"
- ],
- [
- "éĥ",
- "ģ"
- ],
- [
- "Ġm",
- "oney"
- ],
- [
- "le",
- "y"
- ],
- [
- "c",
- "ling"
- ],
- [
- "Ġo",
- "il"
- ],
- [
- "çļ®",
- "èĤ¤"
- ],
- [
- "Ġg",
- "e"
- ],
- [
- "Ġb",
- "at"
- ],
- [
- "ĠP",
- "h"
- ],
- [
- "Ġsc",
- "he"
- ],
- [
- "Ġelect",
- "ric"
- ],
- [
- "v",
- "est"
- ],
- [
- "Ġch",
- "ain"
- ],
- [
- "Ġcap",
- "abilities"
- ],
- [
- "ir",
- "d"
- ],
- [
- "è¯ģ",
- "æĺİ"
- ],
- [
- "æľĢ",
- "好"
- ],
- [
- "iv",
- "il"
- ],
- [
- "Ġdepend",
- "ing"
- ],
- [
- "Ġs",
- "ave"
- ],
- [
- "Ġpract",
- "ical"
- ],
- [
- "Ġcult",
- "ures"
- ],
- [
- "缸åºĶ",
- "çļĦ"
- ],
- [
- "s",
- "y"
- ],
- [
- "çļĦç",
- "²"
- ],
- [
- "Ġbeh",
- "ind"
- ],
- [
- "æĹ¶éĹ´",
- "åĴĮ"
- ],
- [
- "å¹",
- "ħ"
- ],
- [
- "ĠA",
- "g"
- ],
- [
- "Ġeffect",
- "iveness"
- ],
- [
- "A",
- "d"
- ],
- [
- "ĠO",
- "f"
- ],
- [
- "Ġany",
- "thing"
- ],
- [
- "å·§åħĭ",
- "åĬĽ"
- ],
- [
- "Ġm",
- "ist"
- ],
- [
- "Ġlangu",
- "ages"
- ],
- [
- "ĠM",
- "ake"
- ],
- [
- "å",
- "«"
- ],
- [
- "æ£",
- "®"
- ],
- [
- "ĠCon",
- "t"
- ],
- [
- "ĠAbs",
- "olutely"
- ],
- [
- "Ġinvest",
- "ment"
- ],
- [
- "m",
- "at"
- ],
- [
- "çļĦæķħ",
- "äºĭ"
- ],
- [
- "æ¬",
- "§"
- ],
- [
- "Ġspe",
- "ed"
- ],
- [
- "çļĦæ¸",
- "©"
- ],
- [
- "Ġc",
- "ities"
- ],
- [
- "åĨĻ",
- "ä½ľ"
- ],
- [
- "Th",
- "anks"
- ],
- [
- "Ġd",
- "ed"
- ],
- [
- "åĪĨ",
- "éħį"
- ],
- [
- "Ġd",
- "ark"
- ],
- [
- "Ġsupport",
- "ing"
- ],
- [
- "å¹",
- "ķ"
- ],
- [
- "ĠK",
- "e"
- ],
- [
- "éĽ",
- "¶"
- ],
- [
- "Ġsh",
- "aring"
- ],
- [
- "Ġh",
- "ouse"
- ],
- [
- "认",
- "çŁ¥"
- ],
- [
- "Ġsurround",
- "ing"
- ],
- [
- "Ġredu",
- "ced"
- ],
- [
- "Ġf",
- "u"
- ],
- [
- "Ġst",
- "or"
- ],
- [
- "Ġab",
- "s"
- ],
- [
- "T",
- "om"
- ],
- [
- "c",
- "ent"
- ],
- [
- "ĠEduc",
- "ation"
- ],
- [
- "Ġth",
- "r"
- ],
- [
- "ot",
- "t"
- ],
- [
- "ĠTh",
- "at"
- ],
- [
- "Ġhe",
- "ar"
- ],
- [
- "un",
- "g"
- ],
- [
- "Ġbe",
- "yond"
- ],
- [
- "ĠC",
- "o"
- ],
- [
- "ro",
- "om"
- ],
- [
- "è¯Ĺ",
- "æŃĮ"
- ],
- [
- "re",
- "me"
- ],
- [
- "Ġlit",
- "tle"
- ],
- [
- "Ġg",
- "ames"
- ],
- [
- "ä¹ĭ",
- "åIJİ"
- ],
- [
- "éĥ½",
- "ä¼ļ"
- ],
- [
- "è¯Ń",
- "éŁ³"
- ],
- [
- "ç¬",
- "ij"
- ],
- [
- "çī¹",
- "å®ļ"
- ],
- [
- "第",
- "ä¸Ģ"
- ],
- [
- "Ġdep",
- "ression"
- ],
- [
- "Ġinnov",
- "ation"
- ],
- [
- "ĠF",
- "r"
- ],
- [
- "Ġcomput",
- "er"
- ],
- [
- "c",
- "an"
- ],
- [
- "å³",
- "°"
- ],
- [
- "ç¼ĸåĨĻ",
- "ä¸Ģ个"
- ],
- [
- "Ġintern",
- "ational"
- ],
- [
- "Ġcan",
- "cer"
- ],
- [
- "åѦ",
- "èĢħ"
- ],
- [
- "Ġdisc",
- "over"
- ],
- [
- "he",
- "t"
- ],
- [
- "Ġcomp",
- "os"
- ],
- [
- "Ġrec",
- "y"
- ],
- [
- "Ġ2",
- "00"
- ],
- [
- "åIJ«",
- "æľī"
- ],
- [
- "çĹ",
- "Ľ"
- ],
- [
- "ç¼ĵ",
- "è§£"
- ],
- [
- "Ġfre",
- "qu"
- ],
- [
- "çĶ",
- "³"
- ],
- [
- "ĠM",
- "ar"
- ],
- [
- "çļĦ",
- "éĢīæĭ©"
- ],
- [
- "Ġun",
- "t"
- ],
- [
- "Ġreg",
- "ions"
- ],
- [
- "Ġop",
- "in"
- ],
- [
- "ĠGovern",
- "ments"
- ],
- [
- "æ¶",
- "Ĥ"
- ],
- [
- "åĨħ",
- "å¿ĥ"
- ],
- [
- "ä¸Ĭ",
- "æľĢ"
- ],
- [
- "ä»į",
- "çĦ¶"
- ],
- [
- "l",
- "ier"
- ],
- [
- "æ³",
- "³"
- ],
- [
- "äºĴ",
- "缸"
- ],
- [
- "ĠSt",
- "ud"
- ],
- [
- "az",
- "on"
- ],
- [
- "Ġar",
- "ch"
- ],
- [
- "Ġche",
- "m"
- ],
- [
- "çļĦ",
- "èĥ½åĬĽ"
- ],
- [
- "çļĦ",
- "ä¸Ģ个"
- ],
- [
- "Ġa",
- "p"
- ],
- [
- "Ġre",
- "d"
- ],
- [
- "Ġw",
- "omen"
- ],
- [
- "Ġpro",
- "te"
- ],
- [
- "Ġfind",
- "ing"
- ],
- [
- "å§",
- "»"
- ],
- [
- "éĢĤå½ĵ",
- "çļĦ"
- ],
- [
- "Ġfor",
- "ward"
- ],
- [
- "对",
- "象"
- ],
- [
- "Ġwa",
- "it"
- ],
- [
- "Ġconsid",
- "ered"
- ],
- [
- "du",
- "le"
- ],
- [
- "b",
- "acks"
- ],
- [
- "Ġclin",
- "ical"
- ],
- [
- "åħ·",
- "å¤ĩ"
- ],
- [
- "éº",
- "¦"
- ],
- [
- "Ġon",
- "going"
- ],
- [
- "åĨ",
- "Ľ"
- ],
- [
- "Ġf",
- "ar"
- ],
- [
- "åĴĮ",
- "è°"
- ],
- [
- "XX",
- "X"
- ],
- [
- "Ġpolit",
- "ical"
- ],
- [
- "Ġcam",
- "er"
- ],
- [
- "çļĦ",
- "è¡Į为"
- ],
- [
- "æĦı",
- "大åĪ©"
- ],
- [
- "Ġapp",
- "s"
- ],
- [
- "åĩı",
- "è½»"
- ],
- [
- "Ġread",
- "ers"
- ],
- [
- "å©ļ",
- "å§»"
- ],
- [
- "æ°",
- "¸"
- ],
- [
- "o",
- "res"
- ],
- [
- "åħ¨",
- "éĿ¢"
- ],
- [
- "ĠAf",
- "ric"
- ],
- [
- "Ġfavor",
- "ite"
- ],
- [
- "Ġm",
- "ill"
- ],
- [
- "Ġd",
- "ang"
- ],
- [
- "ĠSt",
- "ates"
- ],
- [
- "åĢ",
- "Ł"
- ],
- [
- "å¯",
- "¿"
- ],
- [
- "Ġl",
- "at"
- ],
- [
- "è¿ĩ",
- "åİ»"
- ],
- [
- "Ġtr",
- "uly"
- ],
- [
- "åĽŀçŃĶ",
- "éĹ®é¢ĺ"
- ],
- [
- "Ġco",
- "gn"
- ],
- [
- "ä»",
- "°"
- ],
- [
- "ĠJ",
- "apan"
- ],
- [
- "iz",
- "z"
- ],
- [
- "çļĦæĿ",
- "IJ"
- ],
- [
- "x",
- "x"
- ],
- [
- "é¢ĺ",
- "缮"
- ],
- [
- "ri",
- "ption"
- ],
- [
- "éĤ£",
- "äºĽ"
- ],
- [
- "Ġbud",
- "get"
- ],
- [
- "Ġv",
- "ast"
- ],
- [
- "éļIJ",
- "ç§ģ"
- ],
- [
- "Ġpolic",
- "ymakers"
- ],
- [
- "è¿ĺ",
- "éľĢè¦ģ"
- ],
- [
- "å¹¶",
- "æıIJä¾Ľ"
- ],
- [
- "Ġswe",
- "et"
- ],
- [
- "Ġgener",
- "al"
- ],
- [
- "æ»",
- "¤"
- ],
- [
- "Ġbir",
- "ds"
- ],
- [
- "Ġpl",
- "astic"
- ],
- [
- "Ċ",
- "ĉ"
- ],
- [
- "åĪ",
- "º"
- ],
- [
- "ment",
- "al"
- ],
- [
- "Ġincl",
- "usive"
- ],
- [
- "Ġtop",
- "ics"
- ],
- [
- "Ġs",
- "low"
- ],
- [
- "ä½ł",
- "èĥ½"
- ],
- [
- "è¶³å¤Ł",
- "çļĦ"
- ],
- [
- "è§Ĩ",
- "è§ī"
- ],
- [
- "w",
- "w"
- ],
- [
- "Ġ",
- "使ç͍"
- ],
- [
- "æī",
- "¹"
- ],
- [
- "æ¦Ĥ",
- "念"
- ],
- [
- "é£Ł",
- "ç͍"
- ],
- [
- "èĢ",
- "³"
- ],
- [
- "c",
- "ks"
- ],
- [
- "Ġfra",
- "ud"
- ],
- [
- "Ġingred",
- "ients"
- ],
- [
- "Ġf",
- "asc"
- ],
- [
- "åĮĹ",
- "京"
- ],
- [
- "Ġf",
- "r"
- ],
- [
- "Ġmanufact",
- "uring"
- ],
- [
- "Ġ",
- "ä½ľä¸º"
- ],
- [
- "Ġbe",
- "ach"
- ],
- [
- "é¡",
- "¿"
- ],
- [
- "eri",
- "ous"
- ],
- [
- "å¤ĸ",
- "è§Ĥ"
- ],
- [
- "é¢Ħ",
- "éĺ²"
- ],
- [
- "æĿ¥",
- "èĩª"
- ],
- [
- "èĤĮ",
- "èĤī"
- ],
- [
- "Ġd",
- "ays"
- ],
- [
- "Ġass",
- "ign"
- ],
- [
- "Ġadv",
- "ant"
- ],
- [
- "Ġteam",
- "s"
- ],
- [
- "é¢",
- "Ĺ"
- ],
- [
- "now",
- "n"
- ],
- [
- "ĠP",
- "o"
- ],
- [
- "}",
- "{"
- ],
- [
- "Ġmin",
- "ut"
- ],
- [
- "it",
- "ions"
- ],
- [
- "Ġeas",
- "ily"
- ],
- [
- "ĠB",
- "l"
- ],
- [
- "n",
- "ame"
- ],
- [
- "åѦ",
- "æł¡"
- ],
- [
- "Ġrespons",
- "ibility"
- ],
- [
- "åıij",
- "æĮ¥"
- ],
- [
- "Ġsens",
- "itive"
- ],
- [
- "çŃī",
- "äºİ"
- ],
- [
- "ci",
- "ous"
- ],
- [
- "Ġs",
- "ou"
- ],
- [
- "å±",
- "ı"
- ],
- [
- "Ġr",
- "ich"
- ],
- [
- "å½ĵ",
- "çĦ¶"
- ],
- [
- "m",
- "an"
- ],
- [
- "Ġinterpre",
- "t"
- ],
- [
- "2",
- "4"
- ],
- [
- "Ġshow",
- "s"
- ],
- [
- "èģĮ",
- "åľº"
- ],
- [
- "Ġf",
- "all"
- ],
- [
- "è½",
- "½"
- ],
- [
- "丰å¯Į",
- "çļĦ"
- ],
- [
- "(",
- "'"
- ],
- [
- "ä¿®",
- "æĶ¹"
- ],
- [
- "æĽ´",
- "æį¢"
- ],
- [
- "A",
- "l"
- ],
- [
- "åı¯èĥ½",
- "æĺ¯"
- ],
- [
- "Ġr",
- "ate"
- ],
- [
- "Ġprotect",
- "ing"
- ],
- [
- "f",
- "it"
- ],
- [
- "Ġ5",
- "0"
- ],
- [
- "Ġmove",
- "ment"
- ],
- [
- "è§",
- "Ī"
- ],
- [
- "Ġemploy",
- "ee"
- ],
- [
- "Ġdis",
- "ord"
- ],
- [
- "åĪĽ",
- "æĦı"
- ],
- [
- "产åĵģ",
- "çļĦ"
- ],
- [
- "æľ",
- "Ŀ"
- ],
- [
- "ĊĠĠĠĠĠĠĠĠ",
- "ĠĠĠĠĠĠĠ"
- ],
- [
- "Ġpre",
- "d"
- ],
- [
- "Ġoffer",
- "ing"
- ],
- [
- "åįģ",
- "åĪĨ"
- ],
- [
- "èĢĮ",
- "ä¸įæĺ¯"
- ],
- [
- "Th",
- "ank"
- ],
- [
- "æĽ",
- "¾"
- ],
- [
- "Ġele",
- "ments"
- ],
- [
- "ç²",
- "Ĵ"
- ],
- [
- "Ġcour",
- "ses"
- ],
- [
- "Ġintegr",
- "ated"
- ],
- [
- "ĠC",
- "ar"
- ],
- [
- "agra",
- "ph"
- ],
- [
- "åŁº",
- "åĽł"
- ],
- [
- "Ġinst",
- "ead"
- ],
- [
- "èĦ",
- "±"
- ],
- [
- "åı¦",
- "ä¸Ģ个"
- ],
- [
- "å¯Ĩ",
- "çłģ"
- ],
- [
- "Ġallow",
- "ed"
- ],
- [
- "éĿ¢",
- "åĮħ"
- ],
- [
- "çķ",
- "ªèĮĦ"
- ],
- [
- "åĴĮ",
- "åıijå±ķ"
- ],
- [
- "å°",
- "ģ"
- ],
- [
- "Ġconnect",
- "ion"
- ],
- [
- "åľ¨",
- "ä¸Ģ个"
- ],
- [
- "Ġuse",
- "ful"
- ],
- [
- "è¯Ń",
- "åı¥"
- ],
- [
- "åĪĨ",
- "å¸ĥ"
- ],
- [
- "表",
- "æ¼Ķ"
- ],
- [
- "æľī",
- "æĹ¶"
- ],
- [
- "çļĦæĹ",
- "ħ"
- ],
- [
- "çļĦæĢ",
- "»"
- ],
- [
- "Ġf",
- "ashion"
- ],
- [
- "èĭ",
- "¦"
- ],
- [
- "è¦ģ",
- "注æĦı"
- ],
- [
- "çĶŁ",
- "ç´ł"
- ],
- [
- "Ġnut",
- "ri"
- ],
- [
- "èĩª",
- "è¡Į"
- ],
- [
- "çļĦç",
- "ĭ"
- ],
- [
- "çIJĨè§£",
- "åĴĮ"
- ],
- [
- "Ġc",
- "at"
- ],
- [
- "æľºåύ",
- "åŃ¦ä¹ł"
- ],
- [
- "Ġexh",
- "ib"
- ],
- [
- "åĴĮ",
- "æľįåĬ¡"
- ],
- [
- "fra",
- "c"
- ],
- [
- "e",
- "pend"
- ],
- [
- "Ġimpact",
- "ed"
- ],
- [
- "Ġ",
- "ut"
- ],
- [
- "æķ°",
- "ç»Ħ"
- ],
- [
- "ĠWor",
- "ld"
- ],
- [
- "Ġansw",
- "er"
- ],
- [
- "ers",
- "e"
- ],
- [
- "éª",
- "¨"
- ],
- [
- "Ġart",
- "ists"
- ],
- [
- "åŃ©åŃIJ",
- "çļĦ"
- ],
- [
- "ä»",
- "Ķ"
- ],
- [
- "çĻ",
- "»"
- ],
- [
- "ĠA",
- "re"
- ],
- [
- "Ġco",
- "ol"
- ],
- [
- "Ġcogn",
- "itive"
- ],
- [
- "åIJĦ",
- "个"
- ],
- [
- "l",
- "ike"
- ],
- [
- "å©´",
- "åĦ¿"
- ],
- [
- "åĪĹ",
- "åĩº"
- ],
- [
- "å¹",
- "»"
- ],
- [
- "ron",
- "t"
- ],
- [
- "å®¶",
- "éķ¿"
- ],
- [
- "缺",
- "ä¹ı"
- ],
- [
- "Ġcy",
- "ber"
- ],
- [
- "il",
- "t"
- ],
- [
- "Ġcapt",
- "ure"
- ],
- [
- "å",
- "Ĺ"
- ],
- [
- "åľ¨",
- "äºİ"
- ],
- [
- "Ġthreat",
- "s"
- ],
- [
- "åĴĮ",
- "社ä¼ļ"
- ],
- [
- "Ġcell",
- "s"
- ],
- [
- "æ¸ħ",
- "åįķ"
- ],
- [
- "ĠV",
- "is"
- ],
- [
- "æİ",
- "ī"
- ],
- [
- "Ġh",
- "ol"
- ],
- [
- "åŃIJ",
- "çļĦ"
- ],
- [
- "C",
- "h"
- ],
- [
- "è",
- "Ŀ"
- ],
- [
- "Ġs",
- "aid"
- ],
- [
- "Ġd",
- "ream"
- ],
- [
- "un",
- "ch"
- ],
- [
- "un",
- "e"
- ],
- [
- "ĠD",
- "on"
- ],
- [
- "å®¶",
- "人"
- ],
- [
- "ç±",
- "į"
- ],
- [
- "æĦŁ",
- "åĴĮ"
- ],
- [
- "Ġexperi",
- "enced"
- ],
- [
- "çļĦéĩįè¦ģ",
- "æĢ§"
- ],
- [
- "å¼",
- "ĥ"
- ],
- [
- "um",
- "p"
- ],
- [
- "éĺ",
- "IJ"
- ],
- [
- "Ġhabit",
- "at"
- ],
- [
- "è¢",
- "ĭ"
- ],
- [
- "Ġj",
- "o"
- ],
- [
- "ç®Ģ",
- "æ´ģ"
- ],
- [
- "Ġb",
- "ur"
- ],
- [
- "Ġvisit",
- "ors"
- ],
- [
- "éĽ",
- "ħ"
- ],
- [
- "çļĦçŁ",
- "¥"
- ],
- [
- "Ġent",
- "ire"
- ],
- [
- "讲",
- "è¿°"
- ],
- [
- "äºĨ",
- "ä¸ĢäºĽ"
- ],
- [
- "åįı",
- "ä½ľ"
- ],
- [
- "ĠB",
- "us"
- ],
- [
- "å°",
- "¾"
- ],
- [
- "çļĦæķ",
- "Ļ"
- ],
- [
- "olo",
- "g"
- ],
- [
- "Ġsign",
- "s"
- ],
- [
- "Ġspeak",
- "er"
- ],
- [
- "çļĦ",
- "éŁ³ä¹IJ"
- ],
- [
- "Ġno",
- "vel"
- ],
- [
- "å±ħ",
- "æ°ij"
- ],
- [
- "çļĦ",
- "åıĺåĮĸ"
- ],
- [
- "å°½",
- "éĩı"
- ],
- [
- "Ġspir",
- "it"
- ],
- [
- "å®Į",
- "ç¾İ"
- ],
- [
- "è´",
- "·"
- ],
- [
- "å¿ħè¦ģ",
- "çļĦ"
- ],
- [
- "ie",
- "f"
- ],
- [
- "示",
- "ä¾ĭ"
- ],
- [
- "Ġd",
- "iv"
- ],
- [
- "æķ´",
- "æķ°"
- ],
- [
- "Ġeconom",
- "y"
- ],
- [
- "Ġethical",
- "ly"
- ],
- [
- "éĻ",
- "Ī"
- ],
- [
- "Ġschool",
- "s"
- ],
- [
- "Ġnet",
- "works"
- ]
- ]
- }
-}
\ No newline at end of file
diff --git a/model/tokenizer_config.json b/model/tokenizer_config.json
deleted file mode 100644
index fc4e726..0000000
--- a/model/tokenizer_config.json
+++ /dev/null
@@ -1,43 +0,0 @@
-{
- "add_bos_token": false,
- "add_eos_token": false,
- "add_prefix_space": false,
- "added_tokens_decoder": {
- "0": {
- "content": "<|endoftext|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "1": {
- "content": "<|im_start|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "2": {
- "content": "<|im_end|>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- }
- },
- "additional_special_tokens": [],
- "bos_token": "<|im_start|>",
- "clean_up_tokenization_spaces": false,
- "eos_token": "<|im_end|>",
- "legacy": true,
- "model_max_length": 32768,
- "pad_token": "<|endoftext|>",
- "sp_model_kwargs": {},
- "spaces_between_special_tokens": false,
- "tokenizer_class": "PreTrainedTokenizerFast",
- "unk_token": "<|endoftext|>",
- "chat_template": "{%- if tools %}\n {{- '<|im_start|>system\\n' }}\n {%- if messages[0].role == 'system' %}\n {{- messages[0].content + '\\n\\n' }}\n {%- endif %}\n {{- \"# Tools\\n\\nYou may call one or more functions to assist with the user query.\\n\\nYou are provided with function signatures within XML tags:\\n\" }}\n {%- for tool in tools %}\n {{- \"\\n\" }}\n {{- tool | tojson }}\n {%- endfor %}\n {{- \"\\n \\n\\nFor each function call, return a json object with function name and arguments within XML tags:\\n\\n{\\\"name\\\": , \\\"arguments\\\": }\\n <|im_end|>\\n\" }}\n{%- else %}\n {%- if messages[0]['role'] == 'system' -%}\n {{- '<|im_start|>system\\n' + messages[0]['content'] + '<|im_end|>\\n' }}\n {%- else -%}\n {{- '<|im_start|>system\\nYou are a helpful assistant<|im_end|>\\n' }}\n {%- endif %}\n{%- endif %}\n{%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}\n{%- for message in messages[::-1] %}\n {%- set index = (messages|length - 1) - loop.index0 %}\n {%- if ns.multi_step_tool and message.role == \"user\" and message.content is string and not(message.content.startswith('') and message.content.endswith(' ')) %}\n {%- set ns.multi_step_tool = false %}\n {%- set ns.last_query_index = index %}\n {%- endif %}\n{%- endfor %}\n{%- for message in messages %}\n {%- if message.content is string %}\n {%- set content = message.content %}\n {%- else %}\n {%- set content = '' %}\n {%- endif %}\n {%- if (message.role == \"user\") or (message.role == \"system\" and not loop.first) %}\n {{- '<|im_start|>' + message.role + '\\n' + content + '<|im_end|>' + '\\n' }}\n {%- elif message.role == \"assistant\" %}\n {{- '<|im_start|>' + message.role + '\\n' + content }}\n {%- if message.tool_calls %}\n {%- for tool_call in message.tool_calls %}\n {%- if (loop.first and content) or (not loop.first) %}\n {{- '\\n' }}\n {%- endif %}\n {%- if tool_call.function %}\n {%- set tool_call = tool_call.function %}\n {%- endif %}\n {{- '\\n{\"name\": \"' }}\n {{- tool_call.name }}\n {{- '\", \"arguments\": ' }}\n {%- if tool_call.arguments is string %}\n {{- tool_call.arguments }}\n {%- else %}\n {{- tool_call.arguments | tojson }}\n {%- endif %}\n {{- '}\\n ' }}\n {%- endfor %}\n {%- endif %}\n {{- '<|im_end|>\\n' }}\n {%- elif message.role == \"tool\" %}\n {%- if loop.first or (messages[loop.index0 - 1].role != \"tool\") %}\n {{- '<|im_start|>user' }}\n {%- endif %}\n {{- '\\n\\n' }}\n {{- content }}\n {{- '\\n ' }}\n {%- if loop.last or (messages[loop.index0 + 1].role != \"tool\") %}\n {{- '<|im_end|>\\n' }}\n {%- endif %}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|im_start|>assistant\\n' }}\n {%- if enable_thinking is defined and enable_thinking is false %}\n {{- '\\n\\n \\n\\n' }}\n {%- endif %}\n{%- endif %}"
-}
\ No newline at end of file
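Note: the `chat_template` removed above follows the ChatML-style convention declared by this config's special tokens (`<|im_start|>`, `<|im_end|>`, `<|endoftext|>`). A minimal sketch of how it would be exercised, assuming the tokenizer files were still available in a local `model/` directory (the path is illustrative):

```python
from transformers import AutoTokenizer

# Load the fast tokenizer; it carries the ChatML-style chat template shown above.
tokenizer = AutoTokenizer.from_pretrained("model/")  # illustrative local path

messages = [{"role": "user", "content": "Hello"}]

# Render the prompt as plain text; add_generation_prompt appends the opening
# "<|im_start|>assistant\n" so generation continues as the assistant.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# Per the template: a default system block, then the user turn, then the
# assistant header, each turn wrapped in <|im_start|>role ... <|im_end|> markers.
```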
diff --git a/model/vision_model/README.md b/model/vision_model/README.md
deleted file mode 100644
index 31c8c59..0000000
--- a/model/vision_model/README.md
+++ /dev/null
@@ -1,11 +0,0 @@
-Download the `clip-vit-base-patch16` model into this directory:
-
-```text
-git clone https://huggingface.co/openai/clip-vit-base-patch16
-```
-
-or
-
-```text
-git clone https://www.modelscope.cn/openai-mirror/clip-vit-base-patch16
-```
\ No newline at end of file
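A quick sanity check after cloning, as a sketch only: it assumes `transformers`, `torch`, and `Pillow` are installed and that the clone landed in `model/vision_model/clip-vit-base-patch16`, the path the scripts in this repo expect.

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

path = "model/vision_model/clip-vit-base-patch16"  # directory created by the clone above
model = CLIPModel.from_pretrained(path)
processor = CLIPProcessor.from_pretrained(path)

# Encode a dummy image to confirm the checkpoint and preprocessing config load correctly.
image = Image.new("RGB", (224, 224))
inputs = processor(images=image, return_tensors="pt")
features = model.get_image_features(**inputs)
print(features.shape)  # should be torch.Size([1, 512]) for this checkpoint
```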
diff --git a/requirements.txt b/requirements.txt
index 7f4f0d9..a007367 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,31 +1,2 @@
-datasets==3.6.0
-datasketch==1.6.4
-Flask==3.0.3
-Flask_Cors==4.0.0
-jieba==0.42.1
-jsonlines==4.0.0
-marshmallow==3.22.0
-matplotlib==3.5.1
-ngrok==1.4.0
-nltk==3.8
-numpy==1.26.4
-openai==1.59.6
-peft==0.7.1
-psutil==5.9.8
-pydantic==2.8.2
-rich==13.7.1
-scikit_learn==1.5.1
-sentence_transformers==2.3.1
-simhash==2.1.2
-tiktoken==0.10.0
-transformers==4.57.1
-jinja2==3.1.2
-jsonlines==4.0.0
-trl==0.13.0
-ujson==5.1.0
-wandb==0.18.3
-streamlit==1.50.0
-gradio==5.49.0
-swanlab==0.6.12
-torch==2.6.0
-torchvision==0.21.0
+mkdocs>=1.5.0
+mkdocs-material>=9.0.0
diff --git a/scripts/convert_vlm.py b/scripts/convert_vlm.py
deleted file mode 100644
index 9d3e1b5..0000000
--- a/scripts/convert_vlm.py
+++ /dev/null
@@ -1,44 +0,0 @@
-import os
-import sys
-
-__package__ = "scripts"
-sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
-import torch
-import warnings
-from transformers import AutoTokenizer, AutoModelForCausalLM, LlamaConfig, LlamaForCausalLM
-from model.model_vlm import MiniMindVLM, VLMConfig
-
-warnings.filterwarnings('ignore', category=UserWarning)
-
-
-def convert_torch2transformers_minimind(torch_path, transformers_path, dtype=torch.bfloat16):
- VLMConfig.register_for_auto_class()
- MiniMindVLM.register_for_auto_class("AutoModelForCausalLM")
- lm_model = MiniMindVLM(lm_config, vision_model_path="../model/vision_model/clip-vit-base-patch16")
- device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
- state_dict = torch.load(torch_path, map_location=device)
- lm_model.load_state_dict(state_dict, strict=False)
- lm_model = lm_model.to(dtype)  # convert the model weights to the target precision
- model_params = sum(p.numel() for p in lm_model.parameters() if p.requires_grad)
- print(f'Model parameters: {model_params / 1e6} million = {model_params / 1e9} B (billion)')
- del lm_model.vision_encoder
- lm_model.save_pretrained(transformers_path, safe_serialization=False)
- tokenizer = AutoTokenizer.from_pretrained('../model/')
- tokenizer.save_pretrained(transformers_path)
- print(f"模型已保存为 Transformers-MiniMind-V 格式: {transformers_path}")
-
-
-def convert_transformers2torch(transformers_path, torch_path):
- model = AutoModelForCausalLM.from_pretrained(transformers_path, trust_remote_code=True)
- torch.save(model.state_dict(), torch_path)
- print(f"模型已保存为 PyTorch 格式: {torch_path}")
-
-
-if __name__ == '__main__':
- lm_config = VLMConfig(hidden_size=768, num_hidden_layers=16, max_seq_len=8192, use_moe=False)
-
- torch_path = f"../out/sft_vlm_{lm_config.hidden_size}{'_moe' if lm_config.use_moe else ''}.pth"
-
- transformers_path = '../MiniMind2-V'
-
- convert_torch2transformers_minimind(torch_path, transformers_path)
diff --git a/scripts/web_demo_vlm.py b/scripts/web_demo_vlm.py
deleted file mode 100644
index db867a4..0000000
--- a/scripts/web_demo_vlm.py
+++ /dev/null
@@ -1,198 +0,0 @@
-import os
-import sys
-
-__package__ = "scripts"
-sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
-import argparse
-import torch
-import warnings
-import gradio as gr
-from queue import Queue
-from threading import Thread
-from PIL import Image
-from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
-from model.model_vlm import MiniMindVLM, VLMConfig
-from transformers import logging as hf_logging
-
-hf_logging.set_verbosity_error()
-warnings.filterwarnings('ignore')
-
-
-def init_model(lm_config):
- tokenizer = AutoTokenizer.from_pretrained(args.load_from)
- if 'model' in args.load_from:
- moe_path = '_moe' if lm_config.use_moe else ''
- ckp = f'../{args.save_dir}/{args.weight}_{lm_config.hidden_size}{moe_path}.pth'
- model = MiniMindVLM(lm_config, vision_model_path="../model/vision_model/clip-vit-base-patch16")
- state_dict = torch.load(ckp, map_location=args.device)
- model.load_state_dict({k: v for k, v in state_dict.items() if 'mask' not in k}, strict=False)
- else:
- model = AutoModelForCausalLM.from_pretrained(args.load_from, trust_remote_code=True)
- model.vision_encoder, model.processor = MiniMindVLM.get_vision_model("../model/vision_model/clip-vit-base-patch16")
-
- print(f'VLM parameter count: {sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6:.3f} million')
-
- vision_model, preprocess = model.vision_encoder, model.processor
- return model.eval().to(args.device), tokenizer, vision_model.to(args.device), preprocess
-
-
-class CustomStreamer(TextStreamer):
- def __init__(self, tokenizer, queue):
- super().__init__(tokenizer, skip_prompt=True, skip_special_tokens=True)
- self.queue = queue
- self.tokenizer = tokenizer
-
- def on_finalized_text(self, text: str, stream_end: bool = False):
- self.queue.put(text)
- if stream_end:
- self.queue.put(None)
-
-
-def chat(prompt, current_image_path):
- global temperature, top_p
- image = Image.open(current_image_path).convert('RGB')
- pixel_values = MiniMindVLM.image2tensor(image, preprocess).to(model.device).unsqueeze(0)
-
- prompt = f'{lm_config.image_special_token}\n{prompt}'
- messages = [{"role": "user", "content": prompt}]
-
- new_prompt = tokenizer.apply_chat_template(
- messages,
- tokenize=False,
- add_generation_prompt=True
- )[-args.max_seq_len + 1:]
-
- with torch.no_grad():
- inputs = tokenizer(
- new_prompt,
- return_tensors="pt",
- truncation=True
- ).to(args.device)
- queue = Queue()
- streamer = CustomStreamer(tokenizer, queue)
-
- def _generate():
- model.generate(
- inputs.input_ids,
- max_new_tokens=args.max_seq_len,
- do_sample=True,
- temperature=temperature,
- top_p=top_p,
- attention_mask=inputs.attention_mask,
- pad_token_id=tokenizer.pad_token_id,
- eos_token_id=tokenizer.eos_token_id,
- streamer=streamer,
- pixel_values=pixel_values
- )
-
- Thread(target=_generate).start()
-
- while True:
- text = queue.get()
- if text is None:
- break
- yield text
-
-
-def launch_gradio_server(server_name="0.0.0.0", server_port=7788):
- global temperature, top_p
- temperature = args.temperature
- top_p = args.top_p
-
- with gr.Blocks() as demo:
- gr.HTML(f"""
-
-
-
Hi, I'm MiniMind2-V
-
- """)
-
- with gr.Row():
- with gr.Column(scale=3):
- def get_current_image_path(image):
- global current_image_path
- if image is None:
- current_image_path = ''
- return
- current_image_path = image
- return current_image_path
-
- with gr.Blocks() as iface:
- with gr.Row():
- image_input = gr.Image(type="filepath", label="选择图片", height=650)
- image_input.change(fn=get_current_image_path, inputs=image_input)
-
- def update_parameters(temperature_, top_p_):
- global temperature, top_p
- temperature = float(temperature_)
- top_p = float(top_p_)
- return temperature, top_p
-
- with gr.Blocks() as iface_param:
- with gr.Row():
- temperature_slider = gr.Slider(label="Temperature", minimum=0.5, maximum=1.1, value=0.65)
- top_p_slider = gr.Slider(label="Top-P", minimum=0.7, maximum=0.95, value=0.85)
-
- temperature_slider.change(fn=update_parameters, inputs=[temperature_slider, top_p_slider])
- top_p_slider.change(fn=update_parameters, inputs=[temperature_slider, top_p_slider])
-
- with gr.Column(scale=6):
- def chat_with_vlm(message, history):
- if not message:
- yield history + [("错误", "错误:提问不能为空。")]
- return
- if not current_image_path:
- yield history + [("错误", "错误:图片不能为空。")]
- return
-
- image_html = f'<img src="{current_image_path}">'
- res_generator = chat(message, current_image_path)
- response = ''
- for res in res_generator:
- response += res
- yield history + [(f"{image_html} {message}", response)]
-
- chatbot = gr.Chatbot(label="MiniMind-Vision", height=680)
- with gr.Row():
- with gr.Column(scale=8):
- message_input = gr.Textbox(
- placeholder="请输入你的问题...",
- show_label=False,
- container=False
- )
- with gr.Column(scale=2, min_width=50):
- submit_button = gr.Button("发送")
- submit_button.click(
- fn=chat_with_vlm,
- inputs=[message_input, chatbot],
- outputs=chatbot
- )
-
- # # Add example questions
- # gr.Examples(
- # examples=["Describe the content of this image.", "What is in the picture?", "What is the weather like in the picture?"],
- # inputs=message_input)
-
- demo.launch(server_name=server_name, server_port=server_port)
-
-
-if __name__ == '__main__':
- parser = argparse.ArgumentParser(description="Chat with MiniMind")
- parser.add_argument('--load_from', default='../model', type=str, help="model load path (model = native torch weights, any other path = transformers format)")
- parser.add_argument('--save_dir', default='out', type=str, help="model weights directory")
- parser.add_argument('--weight', default='sft_vlm', type=str, help="weight name prefix (pretrain_vlm, sft_vlm)")
- parser.add_argument('--temperature', default=0.65, type=float, help="sampling temperature, controls randomness (0-1, higher = more random)")
- parser.add_argument('--top_p', default=0.85, type=float, help="nucleus sampling threshold (0-1)")
- parser.add_argument('--device', default='cuda' if torch.cuda.is_available() else 'cpu', type=str, help="device to run on")
- parser.add_argument('--hidden_size', default=512, type=int, help="hidden size (512=Small-26M, 768=Base-104M)")
- parser.add_argument('--num_hidden_layers', default=8, type=int, help="number of hidden layers (Small=8, Base=16)")
- parser.add_argument('--max_seq_len', default=8192, type=int, help="maximum sequence length")
- parser.add_argument('--use_moe', default=0, type=int, choices=[0, 1], help="whether to use the MoE architecture (0=no, 1=yes)")
- parser.add_argument('--stream', default=1, type=int, choices=[0, 1], help="whether to stream the output (0=no, 1=yes)")
- args = parser.parse_args()
-
- lm_config = VLMConfig(hidden_size=args.hidden_size, num_hidden_layers=args.num_hidden_layers,
- max_seq_len=args.max_seq_len, use_moe=bool(args.use_moe))
- model, tokenizer, vision_model, preprocess = init_model(lm_config)
- launch_gradio_server(server_name="0.0.0.0", server_port=8888)
diff --git a/trainer/train_pretrain_vlm.py b/trainer/train_pretrain_vlm.py
deleted file mode 100644
index 32b155d..0000000
--- a/trainer/train_pretrain_vlm.py
+++ /dev/null
@@ -1,172 +0,0 @@
-import os
-import sys
-
-__package__ = "trainer"
-sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
-
-import argparse
-import time
-import warnings
-import torch
-import torch.distributed as dist
-from contextlib import nullcontext
-from torch import optim, nn
-from torch.nn.parallel import DistributedDataParallel
-from torch.utils.data import DataLoader, DistributedSampler
-from transformers import AutoTokenizer
-from model.model_vlm import MiniMindVLM, VLMConfig
-from dataset.lm_dataset import VLMDataset
-from trainer.trainer_utils import get_lr, Logger, is_main_process, init_distributed_mode, setup_seed, init_vlm_model, vlm_checkpoint, SkipBatchSampler
-
-warnings.filterwarnings('ignore')
-
-
-def train_epoch(epoch, loader, iters, start_step=0, wandb=None):
- loss_fct = nn.CrossEntropyLoss(reduction='none')
- start_time = time.time()
- for step, (X, Y, loss_mask, pixel_values) in enumerate(loader, start=start_step + 1):
- X = X.to(args.device)
- Y = Y.to(args.device)
- loss_mask = loss_mask.to(args.device)
- pixel_values = pixel_values.to(args.device)
- lr = get_lr(epoch * iters + step, args.epochs * iters, args.learning_rate)
- for param_group in optimizer.param_groups:
- param_group['lr'] = lr
-
- with autocast_ctx:
- res = model(X, pixel_values=pixel_values)
- loss = loss_fct(
- res.logits.view(-1, res.logits.size(-1)),
- Y.view(-1)
- ).view(Y.size())
-
- loss = (loss * loss_mask).sum() / loss_mask.sum()
- loss += res.aux_loss
- loss = loss / args.accumulation_steps
-
- scaler.scale(loss).backward()
-
- if (step + 1) % args.accumulation_steps == 0:
- scaler.unscale_(optimizer)
- torch.nn.utils.clip_grad_norm_(model.parameters(), args.grad_clip)
-
- scaler.step(optimizer)
- scaler.update()
-
- optimizer.zero_grad(set_to_none=True)
-
- if step % args.log_interval == 0 or step == iters - 1:
- spend_time = time.time() - start_time
- current_loss = loss.item() * args.accumulation_steps
- current_lr = optimizer.param_groups[-1]['lr']
- eta_min = spend_time / (step + 1) * iters // 60 - spend_time // 60
-
- Logger(f'Epoch:[{epoch+1}/{args.epochs}]({step}/{iters}) loss:{current_loss:.6f} lr:{current_lr:.12f} epoch_Time:{eta_min}min')
-
- if wandb: wandb.log({"loss": current_loss, "lr": current_lr, "epoch_Time": eta_min})
-
- if (step % args.save_interval == 0 or step == iters - 1) and is_main_process():
- model.eval()
- moe_suffix = '_moe' if vlm_config.use_moe else ''
- ckp = f'{args.save_dir}/{args.save_weight}_{vlm_config.hidden_size}{moe_suffix}.pth'
- if isinstance(model, torch.nn.parallel.DistributedDataParallel):
- state_dict = model.module.state_dict()
- else:
- state_dict = model.state_dict()
- clean_state_dict = {
- key: value for key, value in state_dict.items() if not key.startswith('vision_encoder.')
- }
- clean_state_dict = {k: v.half() for k, v in clean_state_dict.items()} # save in half precision
- torch.save(clean_state_dict, ckp)
- vlm_checkpoint(vlm_config, weight=args.save_weight, model=model, optimizer=optimizer,
- epoch=epoch, step=step, wandb=wandb, save_dir='../checkpoints', scaler=scaler)
- model.train()
-
-
-if __name__ == "__main__":
- parser = argparse.ArgumentParser(description="MiniMind-V Pretrain")
- parser.add_argument("--save_dir", type=str, default="../out", help="模型保存目录")
- parser.add_argument('--save_weight', default='pretrain_vlm', type=str, help="保存权重的前缀名")
- parser.add_argument("--epochs", type=int, default=4, help="训练轮数")
- parser.add_argument("--batch_size", type=int, default=16, help="batch size")
- parser.add_argument("--learning_rate", type=float, default=4e-4, help="初始学习率")
- parser.add_argument("--device", type=str, default="cuda:0" if torch.cuda.is_available() else "cpu", help="训练设备")
- parser.add_argument("--dtype", type=str, default="bfloat16", help="混合精度类型")
- parser.add_argument("--num_workers", type=int, default=8, help="数据加载线程数")
- parser.add_argument("--accumulation_steps", type=int, default=1, help="梯度累积步数")
- parser.add_argument("--grad_clip", type=float, default=1.0, help="梯度裁剪阈值")
- parser.add_argument("--log_interval", type=int, default=100, help="日志打印间隔")
- parser.add_argument("--save_interval", type=int, default=100, help="模型保存间隔")
- parser.add_argument('--hidden_size', default=512, type=int, help="隐藏层维度")
- parser.add_argument('--num_hidden_layers', default=8, type=int, help="隐藏层数量")
- parser.add_argument('--max_seq_len', default=640, type=int, help="训练的最大截断长度")
- parser.add_argument('--use_moe', default=0, type=int, choices=[0, 1], help="是否使用MoE架构(0=否,1=是)")
- parser.add_argument("--data_path", type=str, default="../dataset/pretrain_data.jsonl", help="训练数据路径")
- parser.add_argument("--images_path", type=str, default="../dataset/pretrain_images", help="训练图像路径")
- parser.add_argument('--from_weight', default='llm', type=str, help="基于哪个权重训练,为none则不基于任何权重训练")
- parser.add_argument('--from_resume', default=0, type=int, choices=[0, 1], help="是否自动检测&续训(0=否,1=是)")
- parser.add_argument('--freeze_llm', default=1, type=int, choices=[0, 1], help="是否冻结LLM参数(0=否,1=是,仅训练vision_proj)")
- parser.add_argument("--use_wandb", action="store_true", help="是否使用wandb")
- parser.add_argument("--wandb_project", type=str, default="MiniMind-V-Pretrain", help="wandb项目名")
- args = parser.parse_args()
-
- # ========== 1. Initialize environment and random seed ==========
- local_rank = init_distributed_mode()
- if dist.is_initialized(): args.device = f"cuda:{local_rank}"
- setup_seed(42 + (dist.get_rank() if dist.is_initialized() else 0))
-
- # ========== 2. Configure directories, model config, and check for checkpoints ==========
- os.makedirs(args.save_dir, exist_ok=True)
- vlm_config = VLMConfig(hidden_size=args.hidden_size, num_hidden_layers=args.num_hidden_layers,
- max_seq_len=args.max_seq_len, use_moe=bool(args.use_moe))
- ckp_data = vlm_checkpoint(vlm_config, weight=args.save_weight, save_dir='../checkpoints') if args.from_resume==1 else None
-
- # ========== 3. Set up mixed precision ==========
- device_type = "cuda" if "cuda" in args.device else "cpu"
- dtype = torch.bfloat16 if args.dtype == "bfloat16" else torch.float16
- autocast_ctx = nullcontext() if device_type == "cpu" else torch.cuda.amp.autocast(dtype=dtype)
-
- # ========== 4. Configure wandb ==========
- wandb = None
- if args.use_wandb and is_main_process():
- import swanlab as wandb
- wandb_id = ckp_data.get('wandb_id') if ckp_data else None
- resume = 'must' if wandb_id else None
- wandb_run_name = f"MiniMind-V-Pretrain-Epoch-{args.epochs}-BatchSize-{args.batch_size}-LearningRate-{args.learning_rate}"
- wandb.init(project=args.wandb_project, name=wandb_run_name, id=wandb_id, resume=resume)
-
- # ========== 5. Define model, data, and optimizer ==========
- model, tokenizer, preprocess = init_vlm_model(vlm_config, from_weight=args.from_weight,
- device=args.device, freeze_llm=bool(args.freeze_llm))
- train_ds = VLMDataset(args.data_path, args.images_path, tokenizer, preprocess=preprocess,
- image_special_token=vlm_config.image_special_token,
- max_length=vlm_config.max_seq_len)
- train_sampler = DistributedSampler(train_ds) if dist.is_initialized() else None
- scaler = torch.cuda.amp.GradScaler(enabled=(args.dtype == 'float16'))
- optimizer = optim.AdamW(filter(lambda p: p.requires_grad, model.parameters()), lr=args.learning_rate)
-
- # ========== 6. Restore state from checkpoint ==========
- start_epoch, start_step = 0, 0
- if ckp_data:
- model.load_state_dict(ckp_data['model'], strict=False)
- optimizer.load_state_dict(ckp_data['optimizer'])
- scaler.load_state_dict(ckp_data['scaler'])
- start_epoch = ckp_data['epoch']
- start_step = ckp_data.get('step', 0)
-
- # ========== 7. Wrap model with DDP ==========
- if dist.is_initialized():
- model._ddp_params_and_buffers_to_ignore = {"pos_cis"}
- model = DistributedDataParallel(model, device_ids=[local_rank])
-
- # ========== 8. Start training ==========
- for epoch in range(start_epoch, args.epochs):
- train_sampler and train_sampler.set_epoch(epoch)
- if epoch == start_epoch and start_step > 0: # first epoch with an existing checkpoint
- batch_sampler = SkipBatchSampler(train_sampler or range(len(train_ds)), args.batch_size, start_step + 1)
- loader = DataLoader(train_ds, batch_sampler=batch_sampler, num_workers=args.num_workers, pin_memory=True)
- Logger(f'Epoch [{epoch + 1}/{args.epochs}]: skipping the first {start_step} steps, resuming from step {start_step + 1}')
- train_epoch(epoch, loader, len(loader) + start_step + 1, start_step, wandb)
- else: # otherwise start from the beginning
- loader = DataLoader(train_ds, batch_size=args.batch_size, shuffle=(train_sampler is None), sampler=train_sampler, num_workers=args.num_workers, pin_memory=True)
- train_epoch(epoch, loader, len(loader), 0, wandb)
diff --git a/trainer/train_sft_vlm.py b/trainer/train_sft_vlm.py
deleted file mode 100644
index 253436e..0000000
--- a/trainer/train_sft_vlm.py
+++ /dev/null
@@ -1,171 +0,0 @@
-import os
-import sys
-
-__package__ = "trainer"
-sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
-
-import argparse
-import time
-import warnings
-import torch
-import torch.distributed as dist
-from contextlib import nullcontext
-from torch import optim, nn
-from torch.nn.parallel import DistributedDataParallel
-from torch.utils.data import DataLoader, DistributedSampler
-from transformers import AutoTokenizer
-from model.model_vlm import MiniMindVLM, VLMConfig
-from dataset.lm_dataset import VLMDataset
-from trainer.trainer_utils import get_lr, Logger, is_main_process, init_distributed_mode, setup_seed, init_vlm_model, vlm_checkpoint, SkipBatchSampler
-
-warnings.filterwarnings('ignore')
-
-
-def train_epoch(epoch, loader, iters, start_step=0, wandb=None):
- loss_fct = nn.CrossEntropyLoss(reduction='none')
- start_time = time.time()
- for step, (X, Y, loss_mask, pixel_values) in enumerate(loader, start=start_step + 1):
- X = X.to(args.device)
- Y = Y.to(args.device)
- loss_mask = loss_mask.to(args.device)
- pixel_values = pixel_values.to(args.device)
- lr = get_lr(epoch * iters + step, args.epochs * iters, args.learning_rate)
- for param_group in optimizer.param_groups:
- param_group['lr'] = lr
-
- with autocast_ctx:
- res = model(X, pixel_values=pixel_values)
- loss = loss_fct(
- res.logits.view(-1, res.logits.size(-1)),
- Y.view(-1)
- ).view(Y.size())
-
- loss = (loss * loss_mask).sum() / loss_mask.sum()
- loss += res.aux_loss
- loss = loss / args.accumulation_steps
-
- scaler.scale(loss).backward()
-
- if (step + 1) % args.accumulation_steps == 0:
- scaler.unscale_(optimizer)
- torch.nn.utils.clip_grad_norm_(model.parameters(), args.grad_clip)
-
- scaler.step(optimizer)
- scaler.update()
-
- optimizer.zero_grad(set_to_none=True)
-
- if step % args.log_interval == 0 or step == iters - 1:
- spend_time = time.time() - start_time
- current_loss = loss.item() * args.accumulation_steps
- current_lr = optimizer.param_groups[-1]['lr']
- eta_min = spend_time / (step + 1) * iters // 60 - spend_time // 60
-
- Logger(f'Epoch:[{epoch+1}/{args.epochs}]({step}/{iters}) loss:{current_loss:.6f} lr:{current_lr:.12f} epoch_Time:{eta_min}min')
-
- if wandb: wandb.log({"loss": current_loss, "lr": current_lr, "epoch_Time": eta_min})
-
- if (step % args.save_interval == 0 or step == iters - 1) and is_main_process():
- model.eval()
- moe_suffix = '_moe' if vlm_config.use_moe else ''
- ckp = f'{args.save_dir}/{args.save_weight}_{vlm_config.hidden_size}{moe_suffix}.pth'
- if isinstance(model, torch.nn.parallel.DistributedDataParallel):
- state_dict = model.module.state_dict()
- else:
- state_dict = model.state_dict()
- clean_state_dict = {
- key: value for key, value in state_dict.items() if not key.startswith('vision_encoder.')
- }
- clean_state_dict = {k: v.half() for k, v in clean_state_dict.items()} # save in half precision
- torch.save(clean_state_dict, ckp)
- vlm_checkpoint(vlm_config, weight=args.save_weight, model=model, optimizer=optimizer,
- epoch=epoch, step=step, wandb=wandb, save_dir='../checkpoints', scaler=scaler)
- model.train()
-
-
-if __name__ == "__main__":
- parser = argparse.ArgumentParser(description="MiniMind-V SFT")
- parser.add_argument("--save_dir", type=str, default="../out", help="模型保存目录")
- parser.add_argument('--save_weight', default='sft_vlm', type=str, help="保存权重的前缀名")
- parser.add_argument("--epochs", type=int, default=2, help="训练轮数")
- parser.add_argument("--batch_size", type=int, default=4, help="batch size")
- parser.add_argument("--learning_rate", type=float, default=1e-6, help="初始学习率")
- parser.add_argument("--device", type=str, default="cuda:0" if torch.cuda.is_available() else "cpu", help="训练设备")
- parser.add_argument("--dtype", type=str, default="bfloat16", help="混合精度类型")
- parser.add_argument("--num_workers", type=int, default=8, help="数据加载线程数")
- parser.add_argument("--accumulation_steps", type=int, default=1, help="梯度累积步数")
- parser.add_argument("--grad_clip", type=float, default=1.0, help="梯度裁剪阈值")
- parser.add_argument("--log_interval", type=int, default=100, help="日志打印间隔")
- parser.add_argument("--save_interval", type=int, default=100, help="模型保存间隔")
- parser.add_argument('--hidden_size', default=512, type=int, help="隐藏层维度")
- parser.add_argument('--num_hidden_layers', default=8, type=int, help="隐藏层数量")
- parser.add_argument('--max_seq_len', default=1536, type=int, help="训练的最大截断长度")
- parser.add_argument('--use_moe', default=0, type=int, choices=[0, 1], help="是否使用MoE架构(0=否,1=是)")
- parser.add_argument("--data_path", type=str, default="../dataset/sft_data.jsonl", help="训练数据路径")
- parser.add_argument("--images_path", type=str, default="../dataset/sft_images", help="训练图像路径")
- parser.add_argument('--from_weight', default='pretrain_vlm', type=str, help="基于哪个权重训练,为none则不基于任何权重训练")
- parser.add_argument('--from_resume', default=0, type=int, choices=[0, 1], help="是否自动检测&续训(0=否,1=是)")
- parser.add_argument("--use_wandb", action="store_true", help="是否使用wandb")
- parser.add_argument("--wandb_project", type=str, default="MiniMind-V-SFT", help="wandb项目名")
- args = parser.parse_args()
-
- # ========== 1. Initialize environment and random seed ==========
- local_rank = init_distributed_mode()
- if dist.is_initialized(): args.device = f"cuda:{local_rank}"
- setup_seed(42 + (dist.get_rank() if dist.is_initialized() else 0))
-
- # ========== 2. Configure directories, model config, and check for checkpoints ==========
- os.makedirs(args.save_dir, exist_ok=True)
- vlm_config = VLMConfig(hidden_size=args.hidden_size, num_hidden_layers=args.num_hidden_layers,
- max_seq_len=args.max_seq_len, use_moe=bool(args.use_moe))
- ckp_data = vlm_checkpoint(vlm_config, weight=args.save_weight, save_dir='../checkpoints') if args.from_resume==1 else None
-
- # ========== 3. Set up mixed precision ==========
- device_type = "cuda" if "cuda" in args.device else "cpu"
- dtype = torch.bfloat16 if args.dtype == "bfloat16" else torch.float16
- autocast_ctx = nullcontext() if device_type == "cpu" else torch.cuda.amp.autocast(dtype=dtype)
-
- # ========== 4. Configure wandb ==========
- wandb = None
- if args.use_wandb and is_main_process():
- import swanlab as wandb
- wandb_id = ckp_data.get('wandb_id') if ckp_data else None
- resume = 'must' if wandb_id else None
- wandb_run_name = f"MiniMind-V-SFT-Epoch-{args.epochs}-BatchSize-{args.batch_size}-LearningRate-{args.learning_rate}"
- wandb.init(project=args.wandb_project, name=wandb_run_name, id=wandb_id, resume=resume)
-
- # ========== 5. Define model, data, and optimizer ==========
- model, tokenizer, preprocess = init_vlm_model(vlm_config, from_weight=args.from_weight,
- device=args.device)
- train_ds = VLMDataset(args.data_path, args.images_path, tokenizer, preprocess=preprocess,
- image_special_token=vlm_config.image_special_token,
- max_length=vlm_config.max_seq_len)
- train_sampler = DistributedSampler(train_ds) if dist.is_initialized() else None
- scaler = torch.cuda.amp.GradScaler(enabled=(args.dtype == 'float16'))
- optimizer = optim.AdamW(model.parameters(), lr=args.learning_rate)
-
- # ========== 6. Restore state from checkpoint ==========
- start_epoch, start_step = 0, 0
- if ckp_data:
- model.load_state_dict(ckp_data['model'], strict=False)
- optimizer.load_state_dict(ckp_data['optimizer'])
- scaler.load_state_dict(ckp_data['scaler'])
- start_epoch = ckp_data['epoch']
- start_step = ckp_data.get('step', 0)
-
- # ========== 7. Wrap model with DDP ==========
- if dist.is_initialized():
- model._ddp_params_and_buffers_to_ignore = {"pos_cis"}
- model = DistributedDataParallel(model, device_ids=[local_rank])
-
- # ========== 8. Start training ==========
- for epoch in range(start_epoch, args.epochs):
- train_sampler and train_sampler.set_epoch(epoch)
- if epoch == start_epoch and start_step > 0: # first epoch with an existing checkpoint
- batch_sampler = SkipBatchSampler(train_sampler or range(len(train_ds)), args.batch_size, start_step + 1)
- loader = DataLoader(train_ds, batch_sampler=batch_sampler, num_workers=args.num_workers, pin_memory=True)
- Logger(f'Epoch [{epoch + 1}/{args.epochs}]: skipping the first {start_step} steps, resuming from step {start_step + 1}')
- train_epoch(epoch, loader, len(loader) + start_step + 1, start_step, wandb)
- else: # otherwise start from the beginning
- loader = DataLoader(train_ds, batch_size=args.batch_size, shuffle=(train_sampler is None), sampler=train_sampler, num_workers=args.num_workers, pin_memory=True)
- train_epoch(epoch, loader, len(loader), 0, wandb)
diff --git a/trainer/trainer_utils.py b/trainer/trainer_utils.py
deleted file mode 100644
index 9d89475..0000000
--- a/trainer/trainer_utils.py
+++ /dev/null
@@ -1,158 +0,0 @@
-"""
-Collection of training utility functions
-"""
-import os
-import random
-import math
-import numpy as np
-import torch
-import torch.distributed as dist
-from torch.utils.data import Sampler
-from transformers import AutoTokenizer
-from model.model_vlm import MiniMindVLM
-
-
-
-def is_main_process():
- return not dist.is_initialized() or dist.get_rank() == 0
-
-
-def Logger(content):
- if is_main_process():
- print(content)
-
-
-def get_lr(current_step, total_steps, lr):
- return lr / 10 + 0.5 * lr * (1 + math.cos(math.pi * current_step / total_steps)) # cosine decay from 1.1*lr down to a floor of lr/10
-
-
-def init_distributed_mode():
- if int(os.environ.get("RANK", -1)) == -1:
- return 0 # non-DDP mode
-
- dist.init_process_group(backend="nccl")
- local_rank = int(os.environ["LOCAL_RANK"])
- torch.cuda.set_device(local_rank)
- return local_rank
-
-
-def setup_seed(seed: int):
- random.seed(seed)
- np.random.seed(seed)
- torch.manual_seed(seed)
- torch.cuda.manual_seed(seed)
- torch.cuda.manual_seed_all(seed)
- torch.backends.cudnn.deterministic = True
- torch.backends.cudnn.benchmark = False
-
-
-def init_vlm_model(vlm_config, from_weight='pretrain_vlm', tokenizer_path='../model',
- vision_model_path='../model/vision_model/clip-vit-base-patch16',
- save_dir='../out', device='cuda', freeze_llm=False):
- tokenizer = AutoTokenizer.from_pretrained(tokenizer_path)
- model = MiniMindVLM(vlm_config, vision_model_path=vision_model_path)
-
- if from_weight != 'none':
- moe_suffix = '_moe' if vlm_config.use_moe else ''
- weight_path = f'{save_dir}/{from_weight}_{vlm_config.hidden_size}{moe_suffix}.pth'
- weights = torch.load(weight_path, map_location=device)
- model.load_state_dict(weights, strict=False)
-
- # Pretraining stage: freeze all parameters except vision_proj
- if freeze_llm:
- for name, param in model.named_parameters():
- if 'vision_proj' not in name:
- param.requires_grad = False
-
- # Optional configuration for the default full-parameter training (commented out)
- # # Only unfreeze the projection-layer parameters of the attention mechanism
- # for name, param in model.model.named_parameters():
- # if any(proj in name for proj in ['q_proj', 'k_proj', 'v_proj', 'o_proj']):
- # param.requires_grad = True
-
- Logger(f'Trainable parameters of the loaded VLM model: {sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6:.3f} million')
- preprocess = model.processor
- return model.to(device), tokenizer, preprocess
-
-
-def vlm_checkpoint(vlm_config, weight='pretrain_vlm', model=None, optimizer=None, epoch=0, step=0, wandb=None, save_dir='../checkpoints', **kwargs):
- os.makedirs(save_dir, exist_ok=True)
- moe_path = '_moe' if vlm_config.use_moe else ''
- ckp_path = f'{save_dir}/{weight}_{vlm_config.hidden_size}{moe_path}.pth'
- resume_path = f'{save_dir}/{weight}_{vlm_config.hidden_size}{moe_path}_resume.pth'
-
- if model is not None:
- from torch.nn.parallel import DistributedDataParallel
- state_dict = model.module.state_dict() if isinstance(model, DistributedDataParallel) else model.state_dict()
- # drop vision_encoder parameters (no need to save them, since they are pretrained)
- clean_state_dict = {k: v for k, v in state_dict.items() if not k.startswith('vision_encoder.')}
- ckp_tmp = ckp_path + '.tmp'
- torch.save({k: v.half() for k, v in clean_state_dict.items()}, ckp_tmp)
- os.replace(ckp_tmp, ckp_path)
-
- wandb_id = None
- if wandb:
- if hasattr(wandb, 'get_run'):
- run = wandb.get_run()
- wandb_id = getattr(run, 'id', None) if run else None
- else:
- wandb_id = getattr(wandb, 'id', None)
-
- resume_data = {
- 'model': state_dict,
- 'optimizer': optimizer.state_dict(),
- 'epoch': epoch,
- 'step': step,
- 'world_size': dist.get_world_size() if dist.is_initialized() else 1,
- 'wandb_id': wandb_id
- }
- for key, value in kwargs.items():
- if value is not None:
- if hasattr(value, 'state_dict'):
- if isinstance(value, DistributedDataParallel):
- resume_data[key] = value.module.state_dict()
- else:
- resume_data[key] = value.state_dict()
- else:
- resume_data[key] = value
-
- resume_tmp = resume_path + '.tmp'
- torch.save(resume_data, resume_tmp)
- os.replace(resume_tmp, resume_path)
- else: # load mode
- if os.path.exists(resume_path):
- ckp_data = torch.load(resume_path, map_location='cpu')
- saved_ws = ckp_data.get('world_size', 1)
- current_ws = dist.get_world_size() if dist.is_initialized() else 1
- if saved_ws != current_ws:
- ckp_data['step'] = ckp_data['step'] * saved_ws // current_ws
- Logger(f'GPU count changed ({saved_ws}→{current_ws}); step automatically converted to {ckp_data["step"]}')
- return ckp_data
- return None
-
-
-class SkipBatchSampler(Sampler):
- def __init__(self, sampler, batch_size, skip_batches=0):
- self.sampler = sampler
- self.batch_size = batch_size
- self.skip_batches = skip_batches
-
- def __iter__(self):
- batch = []
- skipped = 0
- for idx in self.sampler:
- batch.append(idx)
- if len(batch) == self.batch_size:
- if skipped < self.skip_batches:
- skipped += 1
- batch = []
- continue
- yield batch
- batch = []
- if len(batch) > 0 and skipped >= self.skip_batches:
- yield batch
-
- def __len__(self):
- total_batches = (len(self.sampler) + self.batch_size - 1) // self.batch_size
- return max(0, total_batches - self.skip_batches)
-