# messages.GetUnreadMentions

Get unread messages where we were mentioned.
## Example
```js
const { Api, TelegramClient } = require("telegram");
const { StringSession } = require("telegram/sessions");

const apiId = 123456; // Your api_id from my.telegram.org
const apiHash = "your api hash"; // Your api_hash from my.telegram.org
const session = new StringSession(""); // You should put your string session here
const client = new TelegramClient(session, apiId, apiHash, {});

(async function run() {
  await client.connect(); // This assumes you have already authenticated with .start()
  const result = await client.invoke(
    new Api.messages.GetUnreadMentions({
      peer: "username",
      offsetId: 43,
      addOffset: 0,
      limit: 100,
      maxId: 0,
      minId: 0,
    })
  );
  console.log(result); // prints the result
})();
```
```ts
import { Api, TelegramClient } from "telegram";
import { StringSession } from "telegram/sessions";

const apiId = 123456; // Your api_id from my.telegram.org
const apiHash = "your api hash"; // Your api_hash from my.telegram.org
const session = new StringSession(""); // You should put your string session here
const client = new TelegramClient(session, apiId, apiHash, {});

(async function run() {
  await client.connect(); // This assumes you have already authenticated with .start()
  const result: Api.messages.Messages = await client.invoke(
    new Api.messages.GetUnreadMentions({
      peer: "username",
      offsetId: 43,
      addOffset: 0,
      limit: 100,
      maxId: 0,
      minId: 0,
    })
  );
  console.log(result); // prints the result
})();
```
## Parameters

| Name | Type | Description |
| --- | --- | --- |
| peer | InputPeer | Peer where to look for mentions |
| offsetId | int | Offset for pagination; see Pagination in the API |
| addOffset | int | Additional offset for pagination; see Pagination in the API |
| limit | int | Maximum number of results to return; see Pagination in the API |
| maxId | int | Maximum message ID to return; see Pagination in the API |
| minId | int | Minimum message ID to return; see Pagination in the API |
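The offset parameters follow the usual message-history pagination scheme: results come back newest first, and the next page is requested by passing the ID of the oldest message from the previous batch as `offsetId`. Below is a minimal sketch of paging through every unread mention. It assumes an already-connected client as in the example above; the helper name and loop are illustrative, not part of the library.

```ts
import { Api, TelegramClient } from "telegram";

// Illustrative sketch: collect all unread mentions in a chat by paging
// with offsetId. Assumes `client` is an already-connected TelegramClient.
async function fetchAllUnreadMentions(client: TelegramClient, peer: string) {
  const collected: Api.Message[] = [];
  let offsetId = 0; // 0 = start from the most recent message

  while (true) {
    const result = await client.invoke(
      new Api.messages.GetUnreadMentions({
        peer,
        offsetId,
        addOffset: 0,
        limit: 100,
        maxId: 0,
        minId: 0,
      })
    );

    // messages.MessagesNotModified carries no message list; stop there.
    if (!("messages" in result) || result.messages.length === 0) break;

    // Keep only full Message objects (skip service/empty messages).
    const batch = result.messages.filter(
      (m): m is Api.Message => m instanceof Api.Message
    );
    collected.push(...batch);

    // Next page starts below the oldest message ID in this batch.
    offsetId = result.messages[result.messages.length - 1].id;

    if (result.messages.length < 100) break; // last page reached
  }

  return collected;
}
```

Stopping once a batch comes back shorter than `limit` avoids issuing one extra, empty request at the end.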
## Result

messages.Messages
## Possible errors

| Code | Type | Description |
| --- | --- | --- |
| 400 | CHANNEL_INVALID | The provided channel is invalid. |
| 400 | CHANNEL_PRIVATE | You haven't joined this channel/supergroup. |
| 400 | MSG_ID_INVALID | Invalid message ID provided. |
| 400 | PEER_ID_INVALID | The provided peer id is invalid. |
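In GramJS these errors surface as thrown RPC errors rather than as return values. A minimal sketch of catching them follows; it assumes the connected client from the example above, and the `RPCError` import path with its `errorMessage`/`code` fields should be treated as assumptions that may differ between GramJS versions.

```ts
import { Api, TelegramClient } from "telegram";
import { RPCError } from "telegram/errors";

// Sketch: invoke GetUnreadMentions and report documented RPC errors
// (e.g. CHANNEL_PRIVATE, PEER_ID_INVALID) instead of crashing.
async function getMentionsSafely(client: TelegramClient, peer: string) {
  try {
    return await client.invoke(
      new Api.messages.GetUnreadMentions({
        peer,
        offsetId: 0,
        addOffset: 0,
        limit: 100,
        maxId: 0,
        minId: 0,
      })
    );
  } catch (err) {
    if (err instanceof RPCError) {
      console.error(`Telegram returned ${err.errorMessage} (code ${err.code})`);
      return undefined;
    }
    throw err; // not an RPC error: rethrow
  }
}
```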
## Can bots use this method?

No.
## Related pages

**Pagination in the API**

How to fetch results from large lists of objects.